Airbnb was started in 2008 by Brian Chesky and Joe Gebbia, and since then, it has gained popularity due to its low prices and direct interactions with the local community. Airbnb is an online home-sharing platform where people can list or rent properties for short-term use. Be it a spare bedroom, an apartment, a villa, a private island, or even a sofa, anyone looking to earn some profit can promote their space on Airbnb.
Airbnb offers guests a wide variety of options, such as different apartment and rental types, but it gives hosts little functionality for determining the optimum price for their listings. The listing price is a significant factor that hosts must get right: especially in big, competitive cities like Amsterdam, even slight variations in price can push a listing out of the market. When a property is listed, Airbnb suggests a base price derived from the property details, location, and similar properties in the area. Some third-party services and websites provide general guidance, but none of them are free. Since the holiday letting market is constantly changing, hosts should monitor and adjust their prices diligently rather than relying on the base price initially suggested by Airbnb.
In this project, we take the homeowner's perspective and try to determine which factors impact listing prices the most. By doing this, we help hosts determine the optimum nightly price for their listings and maximize their earnings. We decided to work on the Airbnb data of one of its biggest cities, Amsterdam. Unlike in other cities, Airbnb has not been in a favorable position in Amsterdam due to newly established rental regulations and close competition with apartments and other rental services. We will analyze multiple datasets collected from Inside Airbnb, including parameters like availability, neighbourhoods, room types, fees, and more. In addition, we will explore guest reviews and their impact on Airbnb rentals to help hosts make better decisions.
● How does the accessibility to various amenities affect the price of Airbnb listings?
Amenities offered by the host play a significant role in the occupancy rate of a listing. Guests nowadays expect basic amenities like WiFi, toiletries, heating, and air conditioning to be included in the listing. We will analyze how acceptable pricing varies with the number of amenities offered, so that hosts can determine which features or amenities to add to a listing in order to charge more.
● What are the various factors which affect the reviews? What insights can we gain from them?
Having positive reviews on a particular listing increases its probability of being accepted at a higher price by renters and also increases its occupancy rate. We will analyze which factors end up affecting reviews the most and how the host can effectively use them to improve the overall rating of their listing.
● Which areas have the most Airbnb properties, and which are the most expensive?
Locating the most popular rental spots is undoubtedly one of the essential factors for attracting more guests. Usually these are popular tourist spots or locations near the city center, where people will pay a higher price to stay. Hosts can not only increase profits by charging a higher fee in these locations, but also tune their pricing based on the local competition and build a customer base of their own.
● How can we help Airbnb hosts to determine the optimum nightly price for their listings?
Setting an optimum nightly rate according to market demand heavily influences the overall occupancy rate of a listing. The pricing strategy can vary depending on the type and characteristics of the listing. So we will use machine learning to help the host determine the most influential features of their listing and how they can optimize the pricing by leveraging this information.
● What are the various factors which affect the Airbnb listing?
This project focuses on helping an Airbnb host determine how to make the most profit from an existing listing, and on the factors a potential host must consider before publishing a listing on Airbnb. This information can be extremely valuable, as it helps hosts understand which factors to focus on for their existing or potential listings.
The datasets used for this project were collected from the website Inside Airbnb, an independent, non-commercial, open-source data tool. This investigatory website scrapes and reports publicly available information about a city's Airbnb listings. The dataset used in this project was scraped on September 7th, 2021 and contains information on all Amsterdam listings that were live on the website that day.
We have used the following datasets from the website:
listings : This dataset contains a record for every listing published on the website for the city of Amsterdam. It contains important information such as the listing neighbourhood, property type, room type, amenities, price, etc., along with all the host information. It has a total of 16116 rows and 74 columns. We focus mostly on this dataset since it has very detailed information about each listing that can be analyzed to make inferences.
reviews : This dataset contains information regarding guests and the reviews they have published on the website for the listings they have rented. It has a total of 397185 rows and 6 columns. We also have review rating information in the 'listings' dataset, for example the average rating and ratings based on cleanliness, communication, etc. We will use both of these datasets to analyze the impact of ratings.
calendar : This dataset contains information regarding the price and availability of the listings published on the website. It has a total of 5881264 rows and 7 columns.
Even though the compiled data provides a useful basis for examining and monitoring Airbnb practices, it has some critical limitations. The major one is that it only includes the advertised ("sticker") price: the overall nightly price shown to potential guests, rather than the actual average amount paid per night by previous guests. Advertised prices can be set to any arbitrary amount by the host, and hosts less experienced with Airbnb often set them very low or very high.
The data obtained from Inside Airbnb contained a lot of information and needed to be transformed to be useful for data visualization and machine learning. We performed the following operations to prepare it.
We will import all the required libraries before starting the analysis. Then we will load the listings data to a dataframe.
import pandas as pd
import re
import numpy as np
import matplotlib.pyplot as plt
import datetime as dt
import seaborn as sns
import plotly.graph_objects as go
import plotly.express as px
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn import metrics
from sklearn.model_selection import RandomizedSearchCV
from tqdm import tqdm_notebook as tqdm
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.models import Sequential
from tensorflow import keras
path = 'E:\\Downloads\\Fall Classes\\Data Processing and Analysis in Python\\Project Proposal\\'
listings_full_df = pd.read_csv(path + 'listings.csv')
listings_full_df
| id | listing_url | scrape_id | last_scraped | name | description | neighborhood_overview | picture_url | host_id | host_url | ... | review_scores_communication | review_scores_location | review_scores_value | license | instant_bookable | calculated_host_listings_count | calculated_host_listings_count_entire_homes | calculated_host_listings_count_private_rooms | calculated_host_listings_count_shared_rooms | reviews_per_month | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | https://www.airbnb.com/rooms/2818 | 20210907032724 | 2021-09-07 | Quiet Garden View Room & Super Fast WiFi | Quiet Garden View Room & Super Fast WiFi<br />... | Indische Buurt ("Indies Neighborhood") is a ne... | https://a0.muscache.com/pictures/10272854/8dcc... | 3159 | https://www.airbnb.com/users/show/3159 | ... | 4.97 | 4.68 | 4.81 | 0363 5F3A 5684 6750 D14D | t | 1 | 0 | 1 | 0 | 2.86 |
| 1 | 20168 | https://www.airbnb.com/rooms/20168 | 20210907032724 | 2021-09-07 | Studio with private bathroom in the centre 1 | 17th century Dutch townhouse in the heart of t... | Located just in between famous central canals.... | https://a0.muscache.com/pictures/69979628/fd6a... | 59484 | https://www.airbnb.com/users/show/59484 | ... | 4.62 | 4.87 | 4.49 | 0363 CBB3 2C10 0C2A 1E29 | t | 2 | 0 | 2 | 0 | 3.64 |
| 2 | 25428 | https://www.airbnb.com/rooms/25428 | 20210907032724 | 2021-09-07 | Lovely, sunny 1 bed apt in Ctr (w.lift) & firepl. | Lovely apt in Centre ( lift & fireplace) near ... | NaN | https://a0.muscache.com/pictures/138431/7079a9... | 56142 | https://www.airbnb.com/users/show/56142 | ... | 5.00 | 5.00 | 4.80 | NaN | f | 1 | 1 | 0 | 0 | 0.11 |
| 3 | 27886 | https://www.airbnb.com/rooms/27886 | 20210907032724 | 2021-09-07 | Romantic, stylish B&B houseboat in canal district | Stylish and romantic houseboat on fantastic hi... | Central, quiet, safe, clean and beautiful. | https://a0.muscache.com/pictures/02c2da9d-660e... | 97647 | https://www.airbnb.com/users/show/97647 | ... | 4.92 | 4.90 | 4.80 | 0363 974D 4986 7411 88D8 | t | 1 | 0 | 1 | 0 | 2.14 |
| 4 | 28871 | https://www.airbnb.com/rooms/28871 | 20210907032724 | 2021-09-08 | Comfortable double room | <b>The space</b><br />In a monumental house ri... | Flower market , Leidseplein , Rembrantsplein | https://a0.muscache.com/pictures/160889/362340... | 124245 | https://www.airbnb.com/users/show/124245 | ... | 4.94 | 4.97 | 4.82 | 0363 607B EA74 0BD8 2F6F | f | 2 | 0 | 2 | 0 | 4.59 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 16111 | 52001423 | https://www.airbnb.com/rooms/52001423 | 20210907032724 | 2021-09-07 | Relaxed App Next to Amsterdam Canals | (Low price because it is just listed!). This... | This neighborhood is amazing! It is hip and 3... | https://a0.muscache.com/pictures/miso/Hosting-... | 380653922 | https://www.airbnb.com/users/show/380653922 | ... | 5.00 | 5.00 | 5.00 | 0363 0EDB 8E1C E308 ECA2 | f | 1 | 1 | 0 | 0 | 1.00 |
| 16112 | 52016670 | https://www.airbnb.com/rooms/52016670 | 20210907032724 | 2021-09-08 | Penthouse most beautiful apartment of Amsterdam | When the elevator open the doors in the middle... | Best location ever. City on walking distance b... | https://a0.muscache.com/pictures/02141f9f-a486... | 391646360 | https://www.airbnb.com/users/show/391646360 | ... | NaN | NaN | NaN | 0363 8BD3 58E3 BF60 6E91 | f | 1 | 1 | 0 | 0 | NaN |
| 16113 | 52018685 | https://www.airbnb.com/rooms/52018685 | 20210907032724 | 2021-09-07 | (A) Cannabis Friendly - The LUX PENTHOUSE -420X | LUXERY CANNABIS FRIENDLY PENTHOUSE<br /><br />... | NaN | https://a0.muscache.com/pictures/3f474f76-e7ee... | 178187873 | https://www.airbnb.com/users/show/178187873 | ... | NaN | NaN | NaN | Exempt | t | 19 | 19 | 0 | 0 | NaN |
| 16114 | 52050333 | https://www.airbnb.com/rooms/52050333 | 20210907032724 | 2021-09-08 | Bed & Breakfast op mooiste plek van Waterland | In deze unieke accommodatie zit je precies bij... | NaN | https://a0.muscache.com/pictures/miso/Hosting-... | 405106044 | https://www.airbnb.com/users/show/405106044 | ... | NaN | NaN | NaN | NaN | t | 1 | 0 | 1 | 0 | NaN |
| 16115 | 52082799 | https://www.airbnb.com/rooms/52082799 | 20210907032724 | 2021-09-08 | Canal-side studio apartment in the Jordaan | Cosy studio apartment close in the Jordaan, th... | NaN | https://a0.muscache.com/pictures/miso/Hosting-... | 68023842 | https://www.airbnb.com/users/show/68023842 | ... | NaN | NaN | NaN | 0363 4512 09EE 2CEB 936F | f | 1 | 1 | 0 | 0 | NaN |
16116 rows × 74 columns
The dataset has 16116 listings and 74 columns for each listing.
Free text columns and other columns which are not useful in predicting price are being dropped, and the remaining columns are stored in a new dataframe listings_clean_df, keeping the original data intact in listings_full_df. We will also load the reviews data to a dataframe reviews_full_df, and calendar data in calendar_full_df.
listings_col_to_drop = ['listing_url','scrape_id','last_scraped','name','description','neighborhood_overview','picture_url','host_url','host_name','host_location','host_about','host_acceptance_rate','host_thumbnail_url'
,'host_picture_url','host_verifications','calendar_updated','calendar_last_scraped','number_of_reviews_ltm','number_of_reviews_l30d','first_review','last_review','license','reviews_per_month','host_listings_count','host_has_profile_pic']
listings_clean_df = listings_full_df.drop(listings_col_to_drop, axis=1)
listings_clean_df
| id | host_id | host_since | host_response_time | host_response_rate | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | ... | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | calculated_host_listings_count_entire_homes | calculated_host_listings_count_private_rooms | calculated_host_listings_count_shared_rooms | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | within an hour | 100% | t | Indische Buurt | 1.0 | t | Amsterdam, North Holland, Netherlands | ... | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | t | 1 | 0 | 1 | 0 |
| 1 | 20168 | 59484 | 2009-12-02 | within an hour | 100% | f | Grachtengordel | 2.0 | t | Amsterdam, North Holland, Netherlands | ... | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | t | 2 | 0 | 2 | 0 |
| 2 | 25428 | 56142 | 2009-11-20 | NaN | NaN | t | Grachtengordel | 2.0 | f | NaN | ... | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | f | 1 | 1 | 0 | 0 |
| 3 | 27886 | 97647 | 2010-03-23 | within an hour | 86% | t | Westelijke Eilanden | 1.0 | t | Amsterdam, North Holland, Netherlands | ... | 4.96 | 4.95 | 4.92 | 4.90 | 4.80 | t | 1 | 0 | 1 | 0 |
| 4 | 28871 | 124245 | 2010-05-13 | within an hour | 100% | t | Amsterdam Centrum | 2.0 | t | Amsterdam, North Holland, Netherlands | ... | 4.89 | 4.97 | 4.94 | 4.97 | 4.82 | f | 2 | 0 | 2 | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 16111 | 52001423 | 380653922 | 2020-12-18 | within an hour | 100% | f | NaN | 0.0 | t | Amsterdam, Noord-Holland, Netherlands | ... | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | f | 1 | 1 | 0 | 0 |
| 16112 | 52016670 | 391646360 | 2021-03-08 | within an hour | 100% | f | NaN | 0.0 | t | Amsterdam, Noord-Holland, Netherlands | ... | NaN | NaN | NaN | NaN | NaN | f | 1 | 1 | 0 | 0 |
| 16113 | 52018685 | 178187873 | 2018-03-13 | within an hour | 98% | f | Grachtengordel | 18.0 | t | NaN | ... | NaN | NaN | NaN | NaN | NaN | t | 19 | 19 | 0 | 0 |
| 16114 | 52050333 | 405106044 | 2021-06-03 | NaN | NaN | f | NaN | 0.0 | t | NaN | ... | NaN | NaN | NaN | NaN | NaN | t | 1 | 0 | 1 | 0 |
| 16115 | 52082799 | 68023842 | 2016-04-20 | within an hour | 100% | f | NaN | 0.0 | t | NaN | ... | NaN | NaN | NaN | NaN | NaN | f | 1 | 1 | 0 | 0 |
16116 rows × 49 columns
path = 'E:\\Downloads\\Fall Classes\\Data Processing and Analysis in Python\\Project Proposal\\'
reviews_full_df = pd.read_csv(path + 'reviews.csv')
reviews_full_df
| listing_id | id | date | reviewer_id | reviewer_name | comments | |
|---|---|---|---|---|---|---|
| 0 | 2818 | 1191 | 2009-03-30 | 10952 | Lam | Daniel is really cool. The place was nice and ... |
| 1 | 2818 | 1771 | 2009-04-24 | 12798 | Alice | Daniel is the most amazing host! His place is ... |
| 2 | 2818 | 1989 | 2009-05-03 | 11869 | Natalja | We had such a great time in Amsterdam. Daniel ... |
| 3 | 2818 | 2797 | 2009-05-18 | 14064 | Enrique | Very professional operation. Room is very clea... |
| 4 | 2818 | 3151 | 2009-05-25 | 17977 | Sherwin | Daniel is highly recommended. He provided all... |
| ... | ... | ... | ... | ... | ... | ... |
| 397180 | 51758869 | 436665629200717769 | 2021-08-25 | 69375083 | Tyler | Great location, good value |
| 397181 | 51790429 | 444669151630245473 | 2021-09-05 | 99250332 | Michelle | My Mother and i made a little weekend trip to ... |
| 397182 | 51937953 | 445370102503196595 | 2021-09-06 | 98959171 | Alicia | El piso está bien, para dormir genial y puedes... |
| 397183 | 51938910 | 446186310752632995 | 2021-09-07 | 80210416 | Niklas | Everything was fine! The apartment is really n... |
| 397184 | 52001423 | 445428094112018905 | 2021-09-06 | 388702503 | Jessica | The apartment was really nice for a weekend st... |
397185 rows × 6 columns
We will drop the comments column in reviews since we will not be using NLP.
reviews_clean_df = reviews_full_df.drop('comments', axis=1)
reviews_clean_df
| listing_id | id | date | reviewer_id | reviewer_name | |
|---|---|---|---|---|---|
| 0 | 2818 | 1191 | 2009-03-30 | 10952 | Lam |
| 1 | 2818 | 1771 | 2009-04-24 | 12798 | Alice |
| 2 | 2818 | 1989 | 2009-05-03 | 11869 | Natalja |
| 3 | 2818 | 2797 | 2009-05-18 | 14064 | Enrique |
| 4 | 2818 | 3151 | 2009-05-25 | 17977 | Sherwin |
| ... | ... | ... | ... | ... | ... |
| 397180 | 51758869 | 436665629200717769 | 2021-08-25 | 69375083 | Tyler |
| 397181 | 51790429 | 444669151630245473 | 2021-09-05 | 99250332 | Michelle |
| 397182 | 51937953 | 445370102503196595 | 2021-09-06 | 98959171 | Alicia |
| 397183 | 51938910 | 446186310752632995 | 2021-09-07 | 80210416 | Niklas |
| 397184 | 52001423 | 445428094112018905 | 2021-09-06 | 388702503 | Jessica |
397185 rows × 5 columns
path = 'E:\\Downloads\\Fall Classes\\Data Processing and Analysis in Python\\Project Proposal\\'
calendar_full_df = pd.read_csv(path + 'calendar.csv')
calendar_full_df
| listing_id | date | available | price | adjusted_price | minimum_nights | maximum_nights | |
|---|---|---|---|---|---|---|---|
| 0 | 489418 | 2021-09-07 | f | $79.00 | $79.00 | 2.0 | 14.0 |
| 1 | 2818 | 2021-09-07 | f | $59.00 | $59.00 | 3.0 | 1125.0 |
| 2 | 2818 | 2021-09-08 | f | $59.00 | $59.00 | 3.0 | 1125.0 |
| 3 | 2818 | 2021-09-09 | f | $59.00 | $59.00 | 3.0 | 1125.0 |
| 4 | 2818 | 2021-09-10 | f | $59.00 | $59.00 | 3.0 | 1125.0 |
| ... | ... | ... | ... | ... | ... | ... | ... |
| 5881259 | 51718422 | 2022-09-02 | f | $119.00 | $119.00 | 3.0 | 30.0 |
| 5881260 | 51718422 | 2022-09-03 | f | $119.00 | $119.00 | 3.0 | 30.0 |
| 5881261 | 51718422 | 2022-09-04 | f | $119.00 | $119.00 | 3.0 | 30.0 |
| 5881262 | 51718422 | 2022-09-05 | f | $119.00 | $119.00 | 3.0 | 30.0 |
| 5881263 | 51718422 | 2022-09-06 | f | $119.00 | $119.00 | 3.0 | 30.0 |
5881264 rows × 7 columns
We will check whether any columns in the three dataframes have null values and whether these values will impact our analysis. If the number of null values in a column is significant, we will drop the column.
listings_clean_df.isna().sum()
id                                                 0
host_id                                            0
host_since                                         5
host_response_time                             11080
host_response_rate                             11080
host_is_superhost                                  5
host_neighbourhood                              5713
host_total_listings_count                          5
host_identity_verified                             5
neighbourhood                                   5711
neighbourhood_cleansed                             0
neighbourhood_group_cleansed                   16116
latitude                                           0
longitude                                          0
property_type                                      0
room_type                                          0
accommodates                                       0
bathrooms                                      16116
bathrooms_text                                    21
bedrooms                                         898
beds                                              97
amenities                                          0
price                                              0
minimum_nights                                     0
maximum_nights                                     0
minimum_minimum_nights                             3
maximum_minimum_nights                             3
minimum_maximum_nights                             3
maximum_maximum_nights                             3
minimum_nights_avg_ntm                             3
maximum_nights_avg_ntm                             3
has_availability                                   0
availability_30                                    0
availability_60                                    0
availability_90                                    0
availability_365                                   0
number_of_reviews                                  0
review_scores_rating                            2087
review_scores_accuracy                          2301
review_scores_cleanliness                       2300
review_scores_checkin                           2309
review_scores_communication                     2304
review_scores_location                          2309
review_scores_value                             2309
instant_bookable                                   0
calculated_host_listings_count                     0
calculated_host_listings_count_entire_homes        0
calculated_host_listings_count_private_rooms       0
calculated_host_listings_count_shared_rooms        0
dtype: int64
We drop neighbourhood_group_cleansed and bathrooms from the listings data since these columns contain no data at all. We also drop host_response_time and host_response_rate: with 11080 of 16116 values missing (almost 70%), they will not be useful for analysis.
listings_clean_df = listings_clean_df.drop(['neighbourhood_group_cleansed', 'bathrooms','host_response_time','host_response_rate'], axis=1)
listings_clean_df.head(3)
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | calculated_host_listings_count_entire_homes | calculated_host_listings_count_private_rooms | calculated_host_listings_count_shared_rooms | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | t | Indische Buurt | 1.0 | t | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | t | 1 | 0 | 1 | 0 |
| 1 | 20168 | 59484 | 2009-12-02 | f | Grachtengordel | 2.0 | t | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | t | 2 | 0 | 2 | 0 |
| 2 | 25428 | 56142 | 2009-11-20 | t | Grachtengordel | 2.0 | f | NaN | Centrum-West | 52.37490 | ... | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | f | 1 | 1 | 0 | 0 |
3 rows × 45 columns
We will drop the calculated_host_listings_count_entire_homes, calculated_host_listings_count_private_rooms, and calculated_host_listings_count_shared_rooms columns, since calculated_host_listings_count is the sum of these three values.
listings_clean_df = listings_clean_df.drop(['calculated_host_listings_count_entire_homes', 'calculated_host_listings_count_private_rooms', 'calculated_host_listings_count_shared_rooms'], axis=1)
listings_clean_df.head(3)
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | number_of_reviews | review_scores_rating | review_scores_accuracy | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | t | Indische Buurt | 1.0 | t | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 280 | 4.89 | 4.93 | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | t | 1 |
| 1 | 20168 | 59484 | 2009-12-02 | f | Grachtengordel | 2.0 | t | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 339 | 4.44 | 4.69 | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | t | 2 |
| 2 | 25428 | 56142 | 2009-11-20 | t | Grachtengordel | 2.0 | f | NaN | Centrum-West | 52.37490 | ... | 5 | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | f | 1 |
3 rows × 42 columns
There are multiple columns with data on the minimum and maximum number of nights a guest is allowed to stay - minimum_nights, maximum_nights, minimum_minimum_nights, maximum_minimum_nights, minimum_maximum_nights, maximum_maximum_nights, minimum_nights_avg_ntm and maximum_nights_avg_ntm. Among them, we will only be retaining the minimum_nights and maximum_nights columns as this data is sufficient for our analysis.
listings_clean_df = listings_clean_df.drop(['minimum_minimum_nights','maximum_minimum_nights','minimum_maximum_nights','maximum_maximum_nights','minimum_nights_avg_ntm','maximum_nights_avg_ntm'], axis=1)
listings_clean_df.head(3)
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | number_of_reviews | review_scores_rating | review_scores_accuracy | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | t | Indische Buurt | 1.0 | t | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 280 | 4.89 | 4.93 | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | t | 1 |
| 1 | 20168 | 59484 | 2009-12-02 | f | Grachtengordel | 2.0 | t | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 339 | 4.44 | 4.69 | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | t | 2 |
| 2 | 25428 | 56142 | 2009-11-20 | t | Grachtengordel | 2.0 | f | NaN | Centrum-West | 52.37490 | ... | 5 | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | f | 1 |
3 rows × 36 columns
Some columns contain true/false values. We will convert this categorical data to numeric values. To understand the distribution of the data, we create histograms. We will then drop any columns that contain only one value, since they cannot impact our analysis.
# Replacing columns with f/t with 0/1
listings_clean_df.replace({'f': 0, 't': 1}, inplace=True)
# Plotting the distribution of numerical and boolean categories
listings_clean_df.hist(figsize=(20,20), color=['#ff8080']);
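The single-value drop mentioned above can be sketched as follows; `demo_df` and its columns are stand-ins for illustration, since which columns turn out to be constant depends on the scraped data:

```python
import pandas as pd

# Stand-in frame: 'has_availability' holds a single value throughout
demo_df = pd.DataFrame({
    'price': [59, 79, 119],
    'has_availability': [1, 1, 1],
})

# Columns with at most one distinct non-null value carry no signal
constant_cols = [col for col in demo_df.columns
                 if demo_df[col].nunique(dropna=True) <= 1]
demo_df = demo_df.drop(constant_cols, axis=1)
```

`nunique(dropna=True)` counts distinct non-null values, so columns that are entirely null are removed by this check as well.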
Property types

In the listings data, there are multiple property types assigned to the listings. Some identical property types are treated as distinct because of the format of the data, and some entries also include the room type (e.g. "Private room in rental unit"). We will remove the room types and assign new categories to the property types. Entries without a specific property type are changed to 'unknown'.
listings_clean_df['property_type']
0 Private room in rental unit
1 Private room in townhouse
2 Entire rental unit
3 Private room in houseboat
4 Private room in rental unit
...
16111 Entire rental unit
16112 Entire rental unit
16113 Entire rental unit
16114 Private room in bed and breakfast
16115 Entire condominium (condo)
Name: property_type, Length: 16116, dtype: object
# Extract the property part (the text after " in ") into a new column
listings_clean_df['property_type1'] = listings_clean_df['property_type'].str.split(' in ').str[-1]
# Lowercase to unify spellings of the same property type
listings_clean_df['property_type1'] = listings_clean_df['property_type1'].str.lower()
# Remove words that overlap with information in other columns
listings_clean_df['property_type1'] = listings_clean_df['property_type1'].str.replace('entire','').str.replace('tiny','')
# 'bed and breakfast' does not describe the property itself, so mark it unknown
listings_clean_df['property_type1'] = listings_clean_df['property_type1'].str.replace('bed and breakfast','unknown')
# 'private room' does not describe the property either, so mark it unknown
listings_clean_df['property_type1'] = listings_clean_df['property_type1'].str.replace('private room','unknown').str.strip()
listings_clean_df['property_type1'].unique()
array(['rental unit', 'townhouse', 'houseboat', 'guest suite', 'boat',
'unknown', 'residential home', 'loft', 'guesthouse',
'boutique hotel', 'condominium (condo)', 'serviced apartment',
'farm stay', 'chalet', 'bungalow', 'island', 'villa', 'house',
'barn', 'cabin', 'hotel', 'cottage', 'place', 'aparthotel',
'floor', 'dome house', 'earth house', 'home/apt', 'nature lodge',
'casa particular', 'campsite', 'hostel', 'yurt', 'bus', 'tipi',
'camper/rv', 'cave', 'tower'], dtype=object)
#recategorizing the property types
listings_clean_df.property_type1.replace({
'rental unit':'house',
'townhouse':'house',
'houseboat':'boat',
'guest suite':'suite',
'boat':'boat',
'residential home':'house',
'guesthouse':'hotel',
'boutique hotel':'hotel',
'condominium (condo)':'apartment',
'serviced apartment':'apartment',
'chalet':'tiny house',
'bungalow':'tiny house',
'barn':'farm',
'cottage':'cabin',
'place':'house',
'aparthotel':'apartment',
'floor':'tiny house',
'dome house':'house',
'earth house':'house',
'home/apt':'apartment',
'nature lodge':'cabin',
'casa particular':'particular',
'campsite':'camp',
'hostel':'hotel',
'yurt':'camp',
'bus':'bus',
'tipi':'camp',
'camper/rv':'camp',
'cave':'cave',
'tower':'tower'
},inplace = True)
listings_clean_df.head()
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | review_scores_rating | review_scores_accuracy | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | property_type1 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | 1.0 | Indische Buurt | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 4.89 | 4.93 | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | 1 | 1 | house |
| 1 | 20168 | 59484 | 2009-12-02 | 0.0 | Grachtengordel | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 4.44 | 4.69 | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | 1 | 2 | house |
| 2 | 25428 | 56142 | 2009-11-20 | 1.0 | Grachtengordel | 2.0 | 0.0 | NaN | Centrum-West | 52.37490 | ... | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | 0 | 1 | house |
| 3 | 27886 | 97647 | 2010-03-23 | 1.0 | Westelijke Eilanden | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.38761 | ... | 4.95 | 4.93 | 4.96 | 4.95 | 4.92 | 4.90 | 4.80 | 1 | 1 | boat |
| 4 | 28871 | 124245 | 2010-05-13 | 1.0 | Amsterdam Centrum | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.36775 | ... | 4.87 | 4.94 | 4.89 | 4.97 | 4.94 | 4.97 | 4.82 | 0 | 2 | house |
5 rows × 37 columns
Host history

All the data in the 'host_since' column should be dates. We will verify the pattern using a regex.
# Match the expected date pattern; rows in a different format go into wrong_format
pattern = r'\d{4}-\d{1,2}-\d{1,2}'
wrong_format = listings_clean_df[listings_clean_df['host_since'].str.match(pattern) == False]
#get the number of rows
wrong_format.shape[0]
0
listings_clean_df['host_since'].isna().sum()
5
Since no values were in the wrong format, we can convert the column to datetime. In the following snippet, we convert the column into a measure of the number of days the host has been on the platform. The data was scraped on 2021-09-07, so a host's active days are calculated up to that date.
# Convert to datetime
listings_clean_df.host_since = pd.to_datetime(listings_clean_df.host_since)
# Calculate the number of days
scrape_dt = dt.datetime(year=2021,day=7,month=9)
listings_clean_df['host_days_active'] = (scrape_dt - listings_clean_df.host_since)
listings_clean_df.head(5)
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | review_scores_accuracy | review_scores_cleanliness | review_scores_checkin | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | property_type1 | host_days_active | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | 1.0 | Indische Buurt | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 4.93 | 5.00 | 4.97 | 4.97 | 4.68 | 4.81 | 1 | 1 | house | 4731 days |
| 1 | 20168 | 59484 | 2009-12-02 | 0.0 | Grachtengordel | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 4.69 | 4.79 | 4.63 | 4.62 | 4.87 | 4.49 | 1 | 2 | house | 4297 days |
| 2 | 25428 | 56142 | 2009-11-20 | 1.0 | Grachtengordel | 2.0 | 0.0 | NaN | Centrum-West | 52.37490 | ... | 5.00 | 5.00 | 5.00 | 5.00 | 5.00 | 4.80 | 0 | 1 | house | 4309 days |
| 3 | 27886 | 97647 | 2010-03-23 | 1.0 | Westelijke Eilanden | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.38761 | ... | 4.93 | 4.96 | 4.95 | 4.92 | 4.90 | 4.80 | 1 | 1 | boat | 4186 days |
| 4 | 28871 | 124245 | 2010-05-13 | 1.0 | Amsterdam Centrum | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.36775 | ... | 4.94 | 4.89 | 4.97 | 4.94 | 4.97 | 4.82 | 0 | 2 | house | 4135 days |
5 rows × 38 columns
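The subtraction above yields a Timedelta column (displayed as e.g. "4731 days"). If a plain integer day count is preferred, the `.dt.days` accessor extracts it; a minimal sketch on the first two host_since dates from the table above:

```python
import datetime as dt

import pandas as pd

# Toy reproduction of the host_days_active computation with an integer result.
df = pd.DataFrame({"host_since": pd.to_datetime(["2008-09-24", "2009-12-02"])})
scrape_dt = dt.datetime(year=2021, month=9, day=7)  # scrape date used in the notebook
df["host_days_active"] = (scrape_dt - df["host_since"]).dt.days  # integer days
print(df["host_days_active"].tolist())  # [4731, 4297]
```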
bathrooms_text column¶
We found that the bathrooms_text column combines the number and the type of bathrooms in different formats. Some counts are numeric, while others are descriptive. We will standardize the descriptive values and then split the column into separate number and type columns.
listings_clean_df['bathrooms_text'].unique()
array(['1.5 shared baths', '1 private bath', '1 bath', '1.5 baths',
'1 shared bath', nan, '2 baths', '2.5 baths', '0 baths',
'Private half-bath', '3.5 baths', '3 baths', '4 shared baths',
'0 shared baths', 'Half-bath', '2 shared baths', '4 baths',
'3 shared baths', 'Shared half-bath', '6 baths', '8 baths',
'5 baths', '2.5 shared baths', '3.5 shared baths', '5.5 baths',
'13 baths', '4.5 baths'], dtype=object)
We will create a dictionary to format the data to "number shared/private bath/baths".
# half and shared are changed to 0.5 in the dictionary
translation = {'Private half-bath': '0.5 private baths', 'Half-bath': '0.5 baths', 'Shared half-bath': '0.5 shared baths'}
# replace bathrooms_text values using the translation dictionary
listings_clean_df['bathrooms_text_format'] = listings_clean_df['bathrooms_text'].replace(translation)
listings_clean_df['bathrooms_text_format'].unique()
array(['1.5 shared baths', '1 private bath', '1 bath', '1.5 baths',
'1 shared bath', nan, '2 baths', '2.5 baths', '0 baths',
'0.5 private baths', '3.5 baths', '3 baths', '4 shared baths',
'0 shared baths', '0.5 baths', '2 shared baths', '4 baths',
'3 shared baths', '0.5 shared baths', '6 baths', '8 baths',
'5 baths', '2.5 shared baths', '3.5 shared baths', '5.5 baths',
'13 baths', '4.5 baths'], dtype=object)
We split the column into bathrooms_number, which stores the number of bathrooms, and bathrooms_type, which stores the type; NaN and entries without a type are replaced with 'unknown'.
listings_clean_df['bathrooms_number'] = listings_clean_df['bathrooms_text_format'].str.split(' ').str[0]
listings_clean_df['bathrooms_type'] = listings_clean_df['bathrooms_text_format'].str.split(' ').str[1]
listings_clean_df['bathrooms_type'] = listings_clean_df['bathrooms_type'].replace(['bath','baths', np.nan], 'unknown')
listings_clean_df.head()
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | review_scores_communication | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | property_type1 | host_days_active | bathrooms_text_format | bathrooms_number | bathrooms_type | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | 1.0 | Indische Buurt | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.36435 | ... | 4.97 | 4.68 | 4.81 | 1 | 1 | house | 4731 days | 1.5 shared baths | 1.5 | shared |
| 1 | 20168 | 59484 | 2009-12-02 | 0.0 | Grachtengordel | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.36407 | ... | 4.62 | 4.87 | 4.49 | 1 | 2 | house | 4297 days | 1 private bath | 1 | private |
| 2 | 25428 | 56142 | 2009-11-20 | 1.0 | Grachtengordel | 2.0 | 0.0 | NaN | Centrum-West | 52.37490 | ... | 5.00 | 5.00 | 4.80 | 0 | 1 | house | 4309 days | 1 bath | 1 | unknown |
| 3 | 27886 | 97647 | 2010-03-23 | 1.0 | Westelijke Eilanden | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.38761 | ... | 4.92 | 4.90 | 4.80 | 1 | 1 | boat | 4186 days | 1.5 baths | 1.5 | unknown |
| 4 | 28871 | 124245 | 2010-05-13 | 1.0 | Amsterdam Centrum | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.36775 | ... | 4.94 | 4.97 | 4.82 | 0 | 2 | house | 4135 days | 1 shared bath | 1 | shared |
5 rows × 41 columns
amenities available in listings¶
# create a new df to process amenities
df_amenities = listings_clean_df[['id','amenities','price']].copy()
# create a new column containing the amenities parsed into Python lists
df_amenities.loc[:,'list_amenities'] = df_amenities.amenities.apply(eval)
The following function flattens a series of lists into a single pandas Series containing every amenity occurrence.
def to_1D(series):
    return pd.Series([x for _list in series for x in _list])
# show the frequency of the amenities
(to_1D(df_amenities.list_amenities).value_counts()/df_amenities.shape[0])
Wifi 0.972388
Essentials 0.961343
Heating 0.946575
Kitchen 0.882911
Smoke alarm 0.803177
...
Rituals, organic/lush body soap 0.000062
TV with standard cable, Netflix, Chromecast 0.000062
Pelgrim oven 0.000062
Organic body soap 0.000062
65" HDTV with 0.000062
Length: 831, dtype: float64
In addition to differences in importance, amenities also differ greatly in frequency: some are almost universal, while others are very rare.
In this project, amenities are classified and only the relatively important amenities/amenity groups are extracted. At this stage, the importance of an amenity does not come from any mathematical calculation but from common sense and everyday reasoning; the extracted amenities will be studied and screened further in the next stage. Amenity groups that are either too rare or too common will not serve as meaningful price factors because they do not provide sufficient variation across listings.
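As an illustration of that screening idea, amenities whose frequency is extreme in either direction can be filtered out with simple thresholds; the 5%/95% cut-offs and the toy frequencies below are hypothetical, not values used later in the notebook:

```python
import pandas as pd

# Hypothetical sketch: keep only amenities whose frequency falls in a
# mid-range band, since near-universal or near-absent amenities carry
# little pricing signal.
freq = pd.Series({"Wifi": 0.97, "Gym": 0.015, "Parking": 0.41, "Breakfast": 0.08})
keep = freq[(freq > 0.05) & (freq < 0.95)].index.tolist()
print(keep)  # ['Parking', 'Breakfast']
```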
df_amenities.loc[df_amenities['amenities'].str.contains('Internet|wifi',flags=re.IGNORECASE), 'internet'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Amazon Prime Video|sound system|Apple TV|Game console|Netflix|HDTV|bluetooth',flags=re.IGNORECASE), 'recreation'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Air conditioning',flags=re.IGNORECASE), 'air_conditioning'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Cooking|Kitchen|oven|Stove',flags=re.IGNORECASE), 'cook'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Elevator',flags=re.IGNORECASE), 'elevator'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Dedicated workspace',flags=re.IGNORECASE), 'work'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Gym|Fitness center',flags=re.IGNORECASE), 'gym'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Parking',flags=re.IGNORECASE), 'parking'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Children',flags=re.IGNORECASE), 'children'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('Safe|Security system|Smart lock',flags=re.IGNORECASE), 'secure'] = True
df_amenities.loc[df_amenities['amenities'].str.contains('breakfast',flags=re.IGNORECASE), 'breakfast'] = True
df_amenities.fillna(False,inplace = True)
df_amenities.head()
| id | amenities | price | list_amenities | internet | recreation | air_conditioning | cook | elevator | work | gym | parking | children | secure | breakfast | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | ["Single level home", "Coffee maker", "Long te... | $59.00 | [Single level home, Coffee maker, Long term st... | True | False | False | False | False | True | False | True | False | False | False |
| 1 | 20168 | ["Hot water", "TV", "Hangers", "Essentials", "... | $106.00 | [Hot water, TV, Hangers, Essentials, Fire exti... | True | False | False | False | False | True | False | True | False | False | False |
| 2 | 25428 | ["Cable TV", "Coffee maker", "Long term stays ... | $125.00 | [Cable TV, Coffee maker, Long term stays allow... | True | False | False | True | True | True | False | False | False | False | False |
| 3 | 27886 | ["Coffee maker", "Long term stays allowed", "P... | $141.00 | [Coffee maker, Long term stays allowed, Patio ... | True | False | False | False | False | True | False | False | False | True | True |
| 4 | 28871 | ["Hot water", "Shampoo", "Dryer", "Hangers", "... | $75.00 | [Hot water, Shampoo, Dryer, Hangers, Coffee ma... | True | False | False | False | False | False | False | False | False | False | False |
df_amenities.iloc[:,4:-1].sum()/df_amenities.shape[0] # show the frequency of each amenity
internet            0.978531
recreation          0.052246
air_conditioning    0.069434
cook                0.894205
elevator            0.095867
work                0.673306
gym                 0.015078
parking             0.405994
children            0.097853
secure              0.034686
dtype: float64
It can be seen that more than 97% of properties provide internet, so this column offers little discriminating information. We therefore do not treat internet as a valid column and drop it.
df_amenities.drop(columns = 'internet',inplace = True)
We will create a new column called score, whose value is the number of the amenity groups selected above that the listing offers. For example, if a listing's amenities contain only Elevator and Smart lock, its score will be 2.
df_amenities['score'] = df_amenities.loc[:,['recreation','air_conditioning','cook','secure','elevator','work','gym','parking','children','breakfast']].sum(axis = 1)
listings_clean_df['amenities_score'] = df_amenities['score']
review ratings¶
Listings without reviews will be labelled 'no reviews'. The remaining ratings will be grouped into bins. To determine useful bins, we create histograms showing the distribution of each review ratings column.
# Checking the distribution of the review ratings columns
review_cols_to_plot = list(listings_clean_df.columns[listings_clean_df.columns.str.startswith("review_scores") == True])
fig = plt.figure(figsize=(12,8))
for i, col_name in enumerate(review_cols_to_plot):
    ax = fig.add_subplot(3, 3, i+1)
    listings_clean_df[col_name].hist(bins=10, ax=ax, color='#ff8080')
    ax.set_title(col_name)
fig.tight_layout()
plt.show()
From the above histograms we can see that most ratings are 4/5 or 5/5. Therefore, 4/5 and 5/5 each get their own bin, and all the remaining ratings are grouped together into a single bin.
# Binning for all the review ratings columns
bins=[0, 3, 4, 5]
labels=['0-3/5', '4/5', '5/5']
na_label='no reviews'
for col in review_cols_to_plot:
    listings_clean_df[col] = pd.cut(listings_clean_df[col], bins=bins, labels=labels, include_lowest=True)
    # astype('str') turns missing ratings into the string 'nan', so map those to na_label
    listings_clean_df[col] = listings_clean_df[col].astype('str').replace('nan', na_label)
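A toy check of this binning on a few rating values, mapping missing ratings to 'no reviews' (after astype(str), NaN becomes the string 'nan', so replace catches it where a plain fillna would not):

```python
import numpy as np
import pandas as pd

# Toy check of the rating binning used above.
s = pd.Series([2.5, 4.0, 4.9, np.nan])
binned = (pd.cut(s, bins=[0, 3, 4, 5], labels=['0-3/5', '4/5', '5/5'], include_lowest=True)
          .astype(str)
          .replace('nan', 'no reviews'))
print(binned.tolist())  # ['0-3/5', '4/5', '5/5', 'no reviews']
```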
listings_clean_df
| id | host_id | host_since | host_is_superhost | host_neighbourhood | host_total_listings_count | host_identity_verified | neighbourhood | neighbourhood_cleansed | latitude | ... | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | property_type1 | host_days_active | bathrooms_text_format | bathrooms_number | bathrooms_type | amenities_score | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2818 | 3159 | 2008-09-24 | 1.0 | Indische Buurt | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Oostelijk Havengebied - Indische Buurt | 52.364350 | ... | 5/5 | 5/5 | 1 | 1 | house | 4731 days | 1.5 shared baths | 1.5 | shared | 2 |
| 1 | 20168 | 59484 | 2009-12-02 | 0.0 | Grachtengordel | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-Oost | 52.364070 | ... | 5/5 | 5/5 | 1 | 2 | house | 4297 days | 1 private bath | 1 | private | 2 |
| 2 | 25428 | 56142 | 2009-11-20 | 1.0 | Grachtengordel | 2.0 | 0.0 | NaN | Centrum-West | 52.374900 | ... | 5/5 | 5/5 | 0 | 1 | house | 4309 days | 1 bath | 1 | unknown | 3 |
| 3 | 27886 | 97647 | 2010-03-23 | 1.0 | Westelijke Eilanden | 1.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.387610 | ... | 5/5 | 5/5 | 1 | 1 | boat | 4186 days | 1.5 baths | 1.5 | unknown | 4 |
| 4 | 28871 | 124245 | 2010-05-13 | 1.0 | Amsterdam Centrum | 2.0 | 1.0 | Amsterdam, North Holland, Netherlands | Centrum-West | 52.367750 | ... | 5/5 | 5/5 | 0 | 2 | house | 4135 days | 1 shared bath | 1 | shared | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 16111 | 52001423 | 380653922 | 2020-12-18 | 0.0 | NaN | 0.0 | 1.0 | Amsterdam, Noord-Holland, Netherlands | De Baarsjes - Oud-West | 52.359864 | ... | 5/5 | 5/5 | 0 | 1 | house | 263 days | 1.5 baths | 1.5 | unknown | 2 |
| 16112 | 52016670 | 391646360 | 2021-03-08 | 0.0 | NaN | 0.0 | 1.0 | Amsterdam, Noord-Holland, Netherlands | Oud-Oost | 52.358310 | ... | no reviews | no reviews | 0 | 1 | house | 183 days | 2 baths | 2 | unknown | 8 |
| 16113 | 52018685 | 178187873 | 2018-03-13 | 0.0 | Grachtengordel | 18.0 | 1.0 | NaN | Centrum-West | 52.365632 | ... | no reviews | no reviews | 1 | 19 | house | 1274 days | 1.5 baths | 1.5 | unknown | 2 |
| 16114 | 52050333 | 405106044 | 2021-06-03 | 0.0 | NaN | 0.0 | 1.0 | NaN | Noord-Oost | 52.420204 | ... | no reviews | no reviews | 1 | 1 | unknown | 96 days | 1 private bath | 1 | private | 4 |
| 16115 | 52082799 | 68023842 | 2016-04-20 | 0.0 | NaN | 0.0 | 1.0 | NaN | Westerpark | 52.377240 | ... | no reviews | no reviews | 0 | 1 | apartment | 1966 days | 1 bath | 1 | unknown | 1 |
16116 rows × 42 columns
Below is the list of all columns and a short description of the data each column contains from the final processed dataframe.
id - Airbnb's unique identifier for the listing.
host_id - Airbnb's unique identifier for the host/user.
host_since - The date the host/user account was created. For hosts that are Airbnb guests this could be the date they registered as a guest.
host_is_superhost - Boolean value indicating whether the host is a superhost.
host_neighbourhood - The neighbourhood the host resides in.
host_total_listings_count - The number of listings the host has (per Airbnb calculations).
host_identity_verified - Boolean value indicating whether the host's identity has been verified.
neighbourhood - The neighbourhood the listing is located in.
neighbourhood_cleansed - The neighbourhood as geocoded from the latitude and longitude against neighbourhoods defined by open or public digital shapefiles.
latitude - Latitude in the World Geodetic System (WGS84) projection.
longitude - Longitude in the World Geodetic System (WGS84) projection.
property_type - Self-selected property type. Hotels and Bed and Breakfasts are described as such by their hosts in this field.
room_type - All homes are grouped into the following room types: Entire place, Private room, Shared room, and Hotel room.
accommodates - The maximum capacity of the listing.
bathrooms_text - The number and type of bathrooms in the listing.
bedrooms - The number of bedrooms.
beds - The number of bed(s).
amenities - A JSON list of all the amenities provided by the listing.
price - Daily price in local currency.
minimum_nights - Minimum number of nights for a stay (calendar rules may differ).
maximum_nights - Maximum number of nights for a stay (calendar rules may differ).
has_availability - Boolean value indicating whether the listing is available for renting out.
availability_30, availability_60, availability_90, availability_365 - The availability of the listing x days in the future as determined by the calendar. Note that a listing may be unavailable because it has been booked by a guest or blocked by the host.
number_of_reviews - The number of reviews the listing has.
review_scores_rating - The overall rating the listing has received.
review_scores_accuracy - The rating for how accurately the listing is represented on the website.
review_scores_cleanliness - The rating for cleanliness.
review_scores_checkin - The rating for a smooth check-in experience.
review_scores_communication - The rating for communication with the host.
review_scores_location - The rating for the listing's location.
review_scores_value - The rating for value for money.
instant_bookable - Whether the guest can automatically book the listing without the host having to accept the booking request. An indicator of a commercial listing.
calculated_host_listings_count - The number of listings the host has in the current scrape, in the city/region geography.
We think that the number and type of amenities may be factors that affect the price. Perhaps a host can charge a higher price for providing a gym, or for providing many amenities in general.
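The core of testing an amenity hypothesis like this is comparing price distributions with and without the amenity flag, which boils down to a simple groupby; a sketch on toy data (the values are illustrative, not from the dataset):

```python
import pandas as pd

# Toy sketch: median price with vs. without a given amenity flag.
df = pd.DataFrame({"gym": [True, True, False, False],
                   "price": [150.0, 130.0, 90.0, 110.0]})
print(df.groupby("gym")["price"].median().to_dict())  # {False: 100.0, True: 140.0}
```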
Change the data type of price from string to float. First we need to remove the '$'; since some numbers contain ',' (for example 8,000), we also remove the ','. The new column is named 'price_format'.
#Split the price to $ and number. Get only the number and store it to price_format
listings_clean_df['price_format'] = listings_clean_df['price'].str.split('$').str[1]
#filter out ',' in the string
listings_clean_df['price_format'] = listings_clean_df['price_format'].str.replace(',', '')
#change data type to numeric
listings_clean_df['price_format'] = pd.to_numeric(listings_clean_df['price_format'])
To check the distribution of the price, we use a box plot of price_format. By running the commented-out code below, we can see that prices mostly fall in the 0-1000 range, with many outliers. We therefore use np.where to cap the price: we consider prices over 500 to be high and replace them with 500 to generate more meaningful plots.
#plt.boxplot(listings_clean_df['price_format'])
# replace price to 500 for the price higher than 500, otherwise, keep the same price.
listings_clean_df['price_filter'] = np.where((listings_clean_df.price_format > 500), 500, listings_clean_df.price_format)
To check the result, we plot the box plot again.
#plt.boxplot(listings_clean_df['price_filter'])
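The same capping can also be written with pandas' `clip`, which is equivalent to the np.where expression above:

```python
import pandas as pd

# Capping prices at 500, equivalent to np.where(s > 500, 500, s).
s = pd.Series([59.0, 106.0, 1250.0, 8000.0])
print(s.clip(upper=500).tolist())  # [59.0, 106.0, 500.0, 500.0]
```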
Now we apply the same price processing to the df_amenities dataframe.
#Split the price to $ and number. Get only the number and store it to price_format
df_amenities['price_format'] = df_amenities['price'].str.split('$').str[1]
#filter out ',' in the string
df_amenities['price_format'] = df_amenities['price_format'].str.replace(',', '')
#change data type to numeric
df_amenities['price_format'] = pd.to_numeric(df_amenities['price_format'])
# replace price to 500 for the price higher than 500, otherwise, keep the same price.
df_amenities['price_filter'] = np.where((df_amenities.price_format > 500), 500, df_amenities.price_format)
To see whether having an amenity affects price, we use a for loop to generate a box plot for each amenity. The results can be seen by running the commented-out code below.
# amenities_cols_to_plot = list(df_amenities.iloc[:,4:-2].columns)
# for i, col_name in enumerate(amenities_cols_to_plot):
#     sns.catplot(x=col_name, y="price_filter", kind="box", data=df_amenities)
In the above plots, comparing the True and False boxes shows whether a specific amenity is associated with higher prices: the higher the box, the higher the prices.
We can conclude that the amenities associated with higher prices are: recreation, air conditioning, cook, work, gym, parking, children, and secure.
The amenities not associated with higher prices are: elevator and breakfast.
So if a listing offers recreation, air conditioning, cook, work, gym, parking, children, or secure amenities, the host can set a higher price.
Also, among these amenities, secure and children show the biggest price difference between listings that have them and listings that do not. So if a host opening a new listing can provide these amenities, they can price it higher.
On the other hand, we found that prices with and without breakfast were similar. So we suggest hosts not provide breakfast, because it is costly.
We think that having more amenities may be associated with a higher price. To check this, we generated a box plot showing the relationship between the number of amenities and the price.
sns.catplot(x='score', y="price_filter", kind="box", data=df_amenities, height=8, aspect=2, palette='flare')
#sns.color_palette("rocket")
plt.xlabel("Number of Amenities",fontsize=15)
plt.ylabel("Price",fontsize=15)
plt.title("Number of Amenities VS Price Box Plot",fontsize=18)
plt.show()
We found that as the number of amenities increases, the box also moves higher, which means a host can charge a higher price for providing many amenities. However, score 9 shows lower prices, so we need to take a look at the data.
df_amenities[df_amenities['score'] == 9]
| id | amenities | price | list_amenities | recreation | air_conditioning | cook | elevator | work | gym | parking | children | secure | breakfast | score | price_format | price_filter | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4470 | 11620756 | ["Cable TV", "Single level home", "Coffee make... | $71.00 | [Cable TV, Single level home, Coffee maker, Ka... | True | False | True | True | True | True | True | False | True | True | 9 | 71.0 | 71.0 |
| 14575 | 41688975 | ["Coffee maker", "Long term stays allowed", "I... | $110.00 | [Coffee maker, Long term stays allowed, Indoor... | True | True | True | False | True | True | True | False | True | True | 9 | 110.0 | 110.0 |
Since there are only two listings with nine amenities, we suspect that other factors, stronger than the number of amenities, affect their prices.
Most people look at a listing's reviews before making a final decision. So we will try to get an overview of how the reviews are distributed, see whether there is any noticeable pattern in how guests rate a listing, and draw insights from these to help hosts.
If we look at the distribution of the feature 'review_scores_rating', which is the overall rating of a particular listing, we can see that most ratings fall in the 5/5 bin (i.e. 80% of the maximum or more). So people have had good experiences with Airbnbs in Amsterdam so far.
#making a copy of the working dataframe for review analysis
review_analysis_df = listings_clean_df.copy()
#plotting the review scores in a bar plot
fig, ax = plt.subplots(1, 1, figsize=(8,5))
ax.set_title('Rating Distribution of Listings', fontsize=14)
review_analysis_df['review_scores_rating'].value_counts().sort_index(ascending=False).plot(kind='bar', color=['#ff8080', '#ff9999','#ffb3b3', '#ffcccc' ], ax=ax)
ax.set_xticklabels(labels=['no reviews', '5', '4', '0-3'], rotation=0)
ax.set_xlabel('Ratings')
ax.set_ylabel('Number of properties', fontsize=13)
plt.show()
We will look at the reviews by individual criterion to see whether any meaningful insight can be extracted from them. There are a total of seven review columns: description accuracy, cleanliness, check-in, communication, location, value for money, and the overall rating. After plotting the rating columns, we can see that users give comparatively less positive feedback on cleanliness, value, and location.
#Plotting the review columns
review_cols_to_plot = list(listings_clean_df.columns[listings_clean_df.columns.str.startswith("review_scores") == True])
fig = plt.figure(figsize=(20,8))
for i, col_name in enumerate(review_cols_to_plot):
    ax = fig.add_subplot(2, 4, i+1)
    #review_analysis_df[col_name].hist(bins=10, ax=ax)
    review_analysis_df[col_name].value_counts().plot(kind='barh', ax=ax, color=['#ff8080', '#ff9999', '#ffb3b3', '#ffcccc'])
    ax.set_title(col_name)
    ax.set_ylabel('Ratings')
    ax.set_xlabel('Number of properties', fontsize=13)
fig.tight_layout()
plt.show()
mapbox_access_token = 'pk.eyJ1IjoiZmFiaWVubmV5YW5nIiwiYSI6ImNrdnZqejFlNDJzdmIydm1ucnRrMXhybXIifQ.yNoWRN125HKSsv-vux0lKg'
Use the price filter we created to plot a map. The darker the dot is, the higher the price is.
px.set_mapbox_access_token(mapbox_access_token)
fig = px.scatter_mapbox(listings_clean_df, lat="latitude", lon="longitude", color="price_filter",
color_continuous_scale="Sunset", size_max=15, zoom=10, mapbox_style="stamen-terrain")
fig.update_layout(margin={"r":0,"t":0,"l":0,"b":0})
fig.show()
In the above plot, we can see specific areas with darker dots, meaning higher prices. The higher-priced areas are: Amsterdam Marina, Tolhuistuin, Vondelpark, and Sportpark De Eendracht.
Among these places, Sportpark De Eendracht has only a few listings, so we suggest hosts open new listings in that area and set a higher price.
listings_clean_df.drop(columns = ['price_format','price_filter'],inplace = True)
Since the prediction target, price, is a continuous value, we should use a regression model rather than a classification model.
listings_clean_df.isnull().any()
id                                False
host_id                           False
host_since                         True
host_is_superhost                  True
host_neighbourhood                 True
host_total_listings_count          True
host_identity_verified             True
neighbourhood                      True
neighbourhood_cleansed            False
latitude                          False
longitude                         False
property_type                     False
room_type                         False
accommodates                      False
bathrooms_text                     True
bedrooms                           True
beds                               True
amenities                         False
price                             False
minimum_nights                    False
maximum_nights                    False
has_availability                  False
availability_30                   False
availability_60                   False
availability_90                   False
availability_365                  False
number_of_reviews                 False
review_scores_rating              False
review_scores_accuracy            False
review_scores_cleanliness         False
review_scores_checkin             False
review_scores_communication       False
review_scores_location            False
review_scores_value               False
instant_bookable                  False
calculated_host_listings_count    False
property_type1                    False
host_days_active                   True
bathrooms_text_format              True
bathrooms_number                   True
bathrooms_type                    False
amenities_score                   False
dtype: bool
listings_clean_df.price = listings_clean_df.price.str.replace('$', '', regex=False).str.replace(',', '').astype(float)
# map the binned review labels back onto an ordinal numeric scale
review_cols = ['review_scores_rating', 'review_scores_accuracy', 'review_scores_cleanliness',
               'review_scores_checkin', 'review_scores_communication', 'review_scores_location',
               'review_scores_value']
for col in review_cols:
    listings_clean_df[col] = (listings_clean_df[col].replace('no reviews', np.nan)
                              .replace({'0-3/5': 1, '4/5': 2, '5/5': 3}).astype(float))
numerical_columns = [col for col in listings_clean_df.columns if type(listings_clean_df.loc[0,col]) in [np.int64, float, np.float64]]
str_columns = [col for col in listings_clean_df.columns if type(listings_clean_df.loc[0,col]) is str]
Create a heatmap of the correlation matrix; lighter colors indicate higher correlation.
plt.figure(figsize = (10,7))
sns.heatmap(listings_clean_df.corr())
<AxesSubplot:>
selected_feature_df = listings_clean_df.drop(columns = ['id','host_id'])
selected_feature_df.drop(columns = ['host_since','host_neighbourhood','neighbourhood',
'bathrooms_text','host_days_active','property_type',
'amenities','bathrooms_text_format'],
inplace = True)
There are three lighter clusters in the above picture, meaning three clusters of features that are highly correlated with each other.
For the upper-left cluster ('accommodates', 'beds', 'bedrooms'), we keep 'accommodates'.
For the middle cluster ('availability_30', 'availability_60', 'availability_90', 'availability_365'), we keep 'availability_365'.
For the bottom cluster ('review_scores_accuracy', 'review_scores_cleanliness', 'review_scores_checkin', 'review_scores_communication', 'review_scores_value', 'review_scores_location'), we keep 'review_scores_value' and 'review_scores_location'.
selected_feature_df.drop(columns = ['availability_30','availability_60','availability_90','bedrooms','review_scores_rating',
'review_scores_accuracy', 'review_scores_cleanliness',
'review_scores_checkin', 'review_scores_communication'],inplace = True)
selected_feature_df.host_is_superhost.fillna(0,inplace = True)
selected_feature_df.bathrooms_number = listings_clean_df.bathrooms_number.fillna(0.0).astype(float)
selected_feature_df.host_identity_verified.fillna(0,inplace = True)
selected_feature_df.host_total_listings_count.fillna(1,inplace = True)
selected_feature_df.beds.fillna(listings_clean_df.beds.mean(),inplace = True)
selected_feature_df['review_scores_location'].fillna(selected_feature_df['review_scores_location'].mean(),inplace = True)
selected_feature_df.review_scores_value.fillna(listings_clean_df.review_scores_value.mean(),inplace = True)
selected_feature_df.columns
Index(['host_is_superhost', 'host_total_listings_count',
'host_identity_verified', 'neighbourhood_cleansed', 'latitude',
'longitude', 'room_type', 'accommodates', 'beds', 'price',
'minimum_nights', 'maximum_nights', 'has_availability',
'availability_365', 'number_of_reviews', 'review_scores_location',
'review_scores_value', 'instant_bookable',
'calculated_host_listings_count', 'property_type1', 'bathrooms_number',
'bathrooms_type', 'amenities_score'],
dtype='object')
Filter the dataset by price: a price of 0 is obviously abnormal and not meaningful, so such rows are removed.
selected_feature_df = selected_feature_df[selected_feature_df['price'] != 0.0]
selected_feature_df.hist(figsize=(16,16),color='#FF8080')
array of AxesSubplot objects - one histogram panel per numeric column (host_is_superhost, host_total_listings_count, host_identity_verified, latitude, longitude, accommodates, beds, price, minimum_nights, maximum_nights, has_availability, availability_365, number_of_reviews, review_scores_location, review_scores_value, instant_bookable, calculated_host_listings_count, bathrooms_number, amenities_score)
Eliminate the right skewness with log transformation.
transformed_df = selected_feature_df.copy()
right_skew = ['host_total_listings_count', 'accommodates',
              'beds', 'price', 'minimum_nights',
              'availability_365',
              'number_of_reviews', 'calculated_host_listings_count']
for col in right_skew:
    # replace 0s with 0.001 so that log() is defined
    transformed_df[col] = selected_feature_df[col].replace(0.0, 0.001)
    transformed_df[col] = np.log(transformed_df[col])
transformed_df.hist(figsize=(16,16),color='#FF8080')
(Output: the same 5 × 4 grid of histograms, now for the log-transformed features.)
transformed_df.isnull().any()
(Output: isnull().any() returns False for all 23 columns — no missing values remain.)
transformed_df
| | host_is_superhost | host_total_listings_count | host_identity_verified | neighbourhood_cleansed | latitude | longitude | room_type | accommodates | beds | price | ... | availability_365 | number_of_reviews | review_scores_location | review_scores_value | instant_bookable | calculated_host_listings_count | property_type1 | bathrooms_number | bathrooms_type | amenities_score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1.0 | 0.000000 | 1.0 | Oostelijk Havengebied - Indische Buurt | 52.364350 | 4.943580 | Private room | 0.693147 | 0.693147 | 4.077537 | ... | 4.820282 | 5.634790 | 3.000000 | 3.00000 | 1 | 0.000000 | house | 1.5 | shared | 2 |
| 1 | 0.0 | 0.693147 | 1.0 | Centrum-Oost | 52.364070 | 4.893930 | Private room | 0.693147 | 0.000000 | 4.663439 | ... | -6.907755 | 5.826000 | 3.000000 | 3.00000 | 1 | 0.693147 | house | 1.0 | private | 2 |
| 2 | 1.0 | 0.693147 | 0.0 | Centrum-West | 52.374900 | 4.884870 | Entire home/apt | 1.098612 | 0.000000 | 4.828314 | ... | 4.043051 | 1.609438 | 3.000000 | 3.00000 | 0 | 0.000000 | house | 1.0 | unknown | 3 |
| 3 | 1.0 | 0.000000 | 1.0 | Centrum-West | 52.387610 | 4.891880 | Private room | 0.693147 | 0.000000 | 4.948760 | ... | 4.189655 | 5.407172 | 3.000000 | 3.00000 | 1 | 0.000000 | boat | 1.5 | unknown | 4 |
| 4 | 1.0 | 0.693147 | 1.0 | Centrum-West | 52.367750 | 4.890920 | Private room | 0.693147 | 0.000000 | 4.317488 | ... | 5.697093 | 5.866468 | 3.000000 | 3.00000 | 0 | 0.693147 | house | 1.0 | shared | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 16111 | 0.0 | -6.907755 | 1.0 | De Baarsjes - Oud-West | 52.359864 | 4.856234 | Entire home/apt | 1.098612 | 0.000000 | 4.605170 | ... | 3.784190 | 0.000000 | 3.000000 | 3.00000 | 0 | 0.000000 | house | 1.5 | unknown | 2 |
| 16112 | 0.0 | -6.907755 | 1.0 | Oud-Oost | 52.358310 | 4.905953 | Entire home/apt | 1.386294 | 1.098612 | 5.552960 | ... | 3.091042 | -6.907755 | 2.943507 | 2.90541 | 0 | 0.000000 | house | 2.0 | unknown | 8 |
| 16113 | 0.0 | 2.890372 | 1.0 | Centrum-West | 52.365632 | 4.881139 | Entire home/apt | 1.386294 | 0.693147 | 5.187386 | ... | 1.098612 | -6.907755 | 2.943507 | 2.90541 | 1 | 2.944439 | house | 1.5 | unknown | 2 |
| 16114 | 0.0 | -6.907755 | 1.0 | Noord-Oost | 52.420204 | 5.066508 | Private room | 0.693147 | 0.000000 | 4.477337 | ... | 5.899897 | -6.907755 | 2.943507 | 2.90541 | 1 | 0.000000 | unknown | 1.0 | private | 4 |
| 16115 | 0.0 | -6.907755 | 1.0 | Westerpark | 52.377240 | 4.877020 | Entire home/apt | 0.693147 | 0.000000 | 4.779123 | ... | -6.907755 | -6.907755 | 2.943507 | 2.90541 | 0 | 0.000000 | apartment | 1.0 | unknown | 1 |
16101 rows × 23 columns
A one-hot vector is a 1 × N vector used to distinguish each category among N possible values. The vector consists of 0s in all cells with the exception of a single 1 in the cell that uniquely identifies the category.
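As a tiny, self-contained illustration (the three sample rows are made up), `pd.get_dummies` produces exactly this kind of vector for a categorical column:

```python
import pandas as pd

# Hypothetical three-row sample: each room_type value becomes its own 0/1
# column, with exactly one 1 per row (a one-hot vector).
demo = pd.DataFrame({"room_type": ["Private room", "Entire home/apt", "Private room"]})
encoded = pd.get_dummies(demo["room_type"])
print(encoded)
```

This is the same mechanism the `onehotEncode_and_bind` helper below relies on.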
for col in transformed_df.columns:
    print(col, type(transformed_df[col][0]))
(Output: all columns are numeric (numpy.float64 / numpy.int64) except neighbourhood_cleansed, room_type, property_type1, and bathrooms_type, which are str.)
For the columns with a categorical (str) data type, we use one-hot encoding to transform them into numerical values so the algorithms can process them.
def onehotEncode_and_bind(original_df, features):
    '''
    For each feature in `features`:
    encode it with pd.get_dummies,
    bind the encoded result to the original dataset,
    and remove the original feature from the dataset.
    '''
    res = original_df
    for f in features:
        # use get_dummies as the encoder
        dummies = pd.get_dummies(original_df[[f]])
        res = pd.concat([res, dummies], axis=1)
        res = res.drop([f], axis=1)
    return res
encoded_df = onehotEncode_and_bind(transformed_df,['room_type','property_type1','bathrooms_type','neighbourhood_cleansed'])
X = encoded_df.drop('price', axis=1)
y = encoded_df.price
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
X_train
| | host_is_superhost | host_total_listings_count | host_identity_verified | latitude | longitude | accommodates | beds | minimum_nights | maximum_nights | has_availability | ... | neighbourhood_cleansed_Noord-Oost | neighbourhood_cleansed_Noord-West | neighbourhood_cleansed_Oostelijk Havengebied - Indische Buurt | neighbourhood_cleansed_Osdorp | neighbourhood_cleansed_Oud-Noord | neighbourhood_cleansed_Oud-Oost | neighbourhood_cleansed_Slotervaart | neighbourhood_cleansed_Watergraafsmeer | neighbourhood_cleansed_Westerpark | neighbourhood_cleansed_Zuid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4175 | 0.0 | 0.000000 | 1.0 | 52.36414 | 4.93734 | 0.693147 | 0.000000 | 0.693147 | 1125 | 1 | ... | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 11698 | 1.0 | 0.000000 | 0.0 | 52.37024 | 4.91196 | 0.693147 | 0.693147 | 0.000000 | 1125 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 6680 | 0.0 | 0.000000 | 0.0 | 52.35288 | 4.88433 | 1.386294 | 1.098612 | 0.693147 | 1125 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| 7289 | 0.0 | 0.000000 | 1.0 | 52.34558 | 4.89679 | 1.386294 | 1.098612 | 1.098612 | 1125 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 15415 | 0.0 | -6.907755 | 0.0 | 52.36923 | 4.90604 | 2.079442 | 2.079442 | 0.000000 | 365 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 905 | 0.0 | 0.000000 | 1.0 | 52.36411 | 4.86245 | 0.693147 | 0.000000 | 0.693147 | 2 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 5192 | 0.0 | 0.000000 | 0.0 | 52.35497 | 4.88855 | 0.693147 | 0.000000 | 0.000000 | 7 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 12172 | 1.0 | 1.098612 | 1.0 | 52.36655 | 4.90930 | 1.386294 | 0.693147 | 0.693147 | 1125 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 235 | 0.0 | 0.000000 | 0.0 | 52.37577 | 4.86126 | 0.693147 | 0.000000 | 0.693147 | 9 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 13349 | 0.0 | 0.000000 | 1.0 | 52.30744 | 4.95474 | 1.386294 | 0.000000 | 0.000000 | 100 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
12880 rows × 65 columns
reg_ln = LinearRegression()
# train
reg_ln.fit(X_train, y_train)
# predict
y_pred_test = reg_ln.predict(X_test)
y_pred_train = reg_ln.predict(X_train)
Show the relationship between predictions and actual values with a scatter plot.
plt.figure(figsize=(10,10))
plt.scatter(y_test, y_pred_test, alpha = 0.5, color='#FF8080')
plt.xlabel("True Value",fontsize = 15)
plt.ylabel("Prediction",fontsize = 15)
plt.title('Price Prediction by Linear regressor',fontsize = 20)
# diagonal (y = x) reference line
plt.plot(y_test, y_test, ls="--", c=".3")
plt.show()
print("Training RMSE:", np.sqrt(metrics.mean_squared_error(y_train, y_pred_train)))
print("Validation RMSE:", np.sqrt(metrics.mean_squared_error(y_test, y_pred_test)))
print("\nTraining MSE:", metrics.mean_squared_error(y_train, y_pred_train))
print("Validation MSE:", metrics.mean_squared_error(y_test, y_pred_test))
print("\nTraining r2:", metrics.r2_score(y_train, y_pred_train))
print("Validation r2:", metrics.r2_score(y_test, y_pred_test))
Training RMSE: 0.3820519201018117
Validation RMSE: 0.3802375530471993

Training MSE: 0.14596366965348112
Validation MSE: 0.1445805967473217

Training r2: 0.48118312569903143
Validation r2: 0.4759696833694511
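Because the target was log-transformed, these RMSE values live in log-price space. A quick back-of-the-envelope conversion (a sketch using the validation RMSE reported above) shows the typical multiplicative error on the original nightly price:

```python
import numpy as np

# Since price was modelled as log(price), an RMSE in log space maps to a
# multiplicative error factor on the original nightly price: exp(rmse).
log_rmse = 0.3802  # validation RMSE reported above

factor = np.exp(log_rmse)  # factor ≈ 1.46
print(f"typical prediction is off by a factor of ~{factor:.2f}")

# e.g. for a true nightly price of 100 (currency units), a one-RMSE band:
true_price = 100.0
print(f"band around {true_price}: [{true_price / factor:.0f}, {true_price * factor:.0f}]")
```

In other words, the linear model's typical error is roughly ±46% of the true price, which motivates trying more flexible models below.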
reg_rf = RandomForestRegressor()
# train
reg_rf.fit(X_train, y_train)
# predict
y_pred_test = reg_rf.predict(X_test)
y_pred_train = reg_rf.predict(X_train)
Show the relationship between predictions and actual values with a scatter plot.
plt.figure(figsize=(10,10))
plt.scatter(y_test, y_pred_test, alpha = 0.5, color='#FF8080')
plt.xlabel("True Value",fontsize=15)
plt.ylabel("Prediction",fontsize=15)
plt.title('Price Prediction by Random Forest Regressor',fontsize=20)
# diagonal (y = x) reference line
plt.plot(y_test, y_test, ls="--", c=".3")
plt.show()
print("Training RMSE:", np.sqrt(metrics.mean_squared_error(y_train, y_pred_train)))
print("Validation RMSE:", np.sqrt(metrics.mean_squared_error(y_test, y_pred_test)))
print("\nTraining MSE:", metrics.mean_squared_error(y_train, y_pred_train))
print("Validation MSE:", metrics.mean_squared_error(y_test, y_pred_test))
print("\nTraining r2:", metrics.r2_score(y_train, y_pred_train))
print("Validation r2:", metrics.r2_score(y_test, y_pred_test))
Training RMSE: 0.1394012727547156
Validation RMSE: 0.36683801177513914

Training MSE: 0.01943271484563461
Validation MSE: 0.1345701268831371

Training r2: 0.9309278781540025
Validation r2: 0.5122524890194866
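Beyond raw accuracy, a fitted `RandomForestRegressor` exposes `feature_importances_`, which speaks directly to the project's core question of which factors drive price. A self-contained sketch on synthetic data (the feature names and coefficients are illustrative, not taken from the Airbnb dataset):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Synthetic data: "accommodates" drives the target far more than "beds",
# and "availability_365" is pure noise, so the importances should reflect that.
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "accommodates": rng.integers(1, 8, 500),
    "beds": rng.integers(1, 5, 500),
    "availability_365": rng.integers(0, 366, 500),
})
y = 0.5 * X["accommodates"] + 0.1 * X["beds"] + rng.normal(0, 0.1, 500)

rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
importances = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
print(importances)
```

Applied to the real model above, the same two lines (on `reg_rf` and `X_train.columns`) would rank the listing attributes by their contribution to the price predictions.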
Use RandomizedSearchCV to find optimized hyperparameters, including n_estimators, max_features, max_depth, and min_samples_leaf.
param_grid = {
    # Number of trees in the random forest
    'n_estimators': [100, 200, 500],
    # Number of features to consider at every split
    'max_features': ['auto', 'sqrt'],
    # Maximum number of levels in each tree
    'max_depth': [x for x in range(1, 50, 5)],
    # Minimum number of samples required at each leaf node
    'min_samples_leaf': [2, 4, 10, 16],
}
tuned_rf = RandomizedSearchCV(reg_rf, param_grid, scoring='neg_mean_squared_error', n_iter = 10, cv = 5, verbose=2)
# Train the tuned model
tuned_rf.fit(X_train,y_train)
# Predict
y_pred_test = tuned_rf.predict(X_test)
y_pred_train = tuned_rf.predict(X_train)
Fitting 5 folds for each of 10 candidates, totalling 50 fits
Show the relationship between predictions and actual values with a scatter plot.
plt.figure(figsize=(10,10))
plt.scatter(y_test, y_pred_test, alpha = 0.5, color='#FF8080')
plt.xlabel("True Value",fontsize=15)
plt.ylabel("Prediction",fontsize=15)
plt.title('Price Prediction by Tuned Random Forest Regressor',fontsize=20)
# diagonal (y = x) reference line
plt.plot(y_test, y_test, ls="--", c=".3")
plt.show()
print("Training RMSE:", np.sqrt(metrics.mean_squared_error(y_train, y_pred_train)))
print("Validation RMSE:",np.sqrt(metrics.mean_squared_error(y_test, y_pred_test)))
print("\nTraining MSE:", metrics.mean_squared_error(y_train, y_pred_train))
print("Validation MSE:", metrics.mean_squared_error(y_test, y_pred_test))
print("\nTraining r2:", metrics.r2_score(y_train, y_pred_train))
print("Validation r2:", metrics.r2_score(y_test, y_pred_test))
Training RMSE: 0.2693221458131423
Validation RMSE: 0.3641575184128079

Training MSE: 0.07253441822539548
Validation MSE: 0.13261069821657454

Training r2: 0.7421818714733759
Validation r2: 0.5193544103537044
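After the search finishes, the winning configuration can be read back from the fitted `RandomizedSearchCV` object via `best_params_` and `best_score_`. A minimal self-contained sketch (synthetic data and a reduced grid, for illustration only):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Tiny stand-in for the search above: fit on synthetic data, then read back
# the best hyperparameters and their cross-validated score.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=1)
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=1),
    {"n_estimators": [10, 20], "max_depth": [3, 6]},
    n_iter=3, cv=3, scoring="neg_mean_squared_error", random_state=1,
)
search.fit(X, y)
print(search.best_params_)   # the sampled configuration with the best CV score
print(search.best_score_)    # negative MSE, so higher (closer to 0) is better
```

On the real `tuned_rf` above, the same attributes reveal which of the sampled depth/leaf/tree-count combinations won.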
input_dimension = X_train.shape[1]
# Initialising
model = Sequential()
# Adding input layer and first hidden layer
model.add(Dense(32, activation = 'relu', input_dim = input_dimension))
# Second hidden layer
model.add(Dense(units = 64, activation = 'relu'))
# Third hidden layer
model.add(Dense(units = 64, activation = 'relu'))
# Output layer
model.add(Dense(units = 1))
# optimizer
opt = keras.optimizers.Adam(learning_rate=0.001)
# Compiling
model.compile(optimizer = opt, loss = 'mean_squared_error', metrics=["mean_squared_error"])
# Fitting to the Training set
history = model.fit(X_train, y_train, batch_size = 32, epochs = 1000)
Epoch 1/1000 - loss: 2.9998 - mean_squared_error: 2.9998
...
Epoch 157/1000 - loss: 0.1325 - mean_squared_error: 0.1325
(per-epoch log truncated; the training loss decreases steadily from ≈3.00 to ≈0.13)
[==============================] - 1s 1ms/step - loss: 0.1322 - mean_squared_error: 0.1322 Epoch 159/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1306 - mean_squared_error: 0.1306 Epoch 160/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1331 - mean_squared_error: 0.1331 Epoch 161/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1309 - mean_squared_error: 0.1309 Epoch 162/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1325 - mean_squared_error: 0.1325 Epoch 163/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1306 - mean_squared_error: 0.1306 Epoch 164/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1309 - mean_squared_error: 0.1309 Epoch 165/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1355 - mean_squared_error: 0.1355 Epoch 166/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1337 - mean_squared_error: 0.1337 Epoch 167/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1297 - mean_squared_error: 0.1297 Epoch 168/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1303 - mean_squared_error: 0.1303 Epoch 169/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1304 - mean_squared_error: 0.1304 Epoch 170/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1324 - mean_squared_error: 0.1324 Epoch 171/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1316 - mean_squared_error: 0.1316 Epoch 172/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1308 - mean_squared_error: 0.1308 Epoch 173/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1316 - mean_squared_error: 0.1316 Epoch 174/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1308 - mean_squared_error: 0.1308 Epoch 175/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1294 
- mean_squared_error: 0.1294 Epoch 176/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1308 - mean_squared_error: 0.1308 Epoch 177/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1313 - mean_squared_error: 0.1313 Epoch 178/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1310 - mean_squared_error: 0.1310 Epoch 179/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1314 - mean_squared_error: 0.1314 Epoch 180/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1281 - mean_squared_error: 0.1281 Epoch 181/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1282 - mean_squared_error: 0.1282 Epoch 182/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1297 - mean_squared_error: 0.1297 Epoch 183/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1292 - mean_squared_error: 0.1292 Epoch 184/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1313 - mean_squared_error: 0.1313 Epoch 185/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1292 - mean_squared_error: 0.1292 Epoch 186/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1288 - mean_squared_error: 0.1288 Epoch 187/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1292 - mean_squared_error: 0.1292 Epoch 188/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1261 - mean_squared_error: 0.1261 Epoch 189/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1306 - mean_squared_error: 0.1306 Epoch 190/1000 403/403 [==============================] - 0s 980us/step - loss: 0.1285 - mean_squared_error: 0.1285 Epoch 191/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1283 - mean_squared_error: 0.1283 Epoch 192/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1278 - mean_squared_error: 0.1278 Epoch 193/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1297 - mean_squared_error: 0.1297 Epoch 194/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1276 - mean_squared_error: 0.1276 Epoch 195/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1276 - mean_squared_error: 0.1276 Epoch 196/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1313 - mean_squared_error: 0.1313 Epoch 197/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1281 - mean_squared_error: 0.1281 Epoch 198/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1287 - mean_squared_error: 0.1287 Epoch 199/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1294 - mean_squared_error: 0.1294 Epoch 200/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1308 - mean_squared_error: 0.1308 Epoch 201/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1283 - mean_squared_error: 0.1283 Epoch 202/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1274 - mean_squared_error: 0.1274 Epoch 203/1000 403/403 [==============================] - 0s 955us/step - loss: 0.1276 - mean_squared_error: 0.1276 Epoch 204/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1282 - mean_squared_error: 0.1282 Epoch 205/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1275 - mean_squared_error: 0.1275 Epoch 206/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1282 - mean_squared_error: 0.1282 Epoch 207/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1280 - mean_squared_error: 0.1280 Epoch 208/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1279 - mean_squared_error: 0.1279 Epoch 209/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1272 - mean_squared_error: 0.1272 Epoch 210/1000 403/403 [==============================] - 0s 1ms/step - loss: 
0.1269 - mean_squared_error: 0.1269 Epoch 211/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1297 - mean_squared_error: 0.1297 Epoch 212/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1274 - mean_squared_error: 0.1274 Epoch 213/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1278 - mean_squared_error: 0.1278 Epoch 214/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1266 - mean_squared_error: 0.1266 Epoch 215/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1270 - mean_squared_error: 0.1270 Epoch 216/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1281 - mean_squared_error: 0.1281 Epoch 217/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1265 - mean_squared_error: 0.1265 Epoch 218/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1266 - mean_squared_error: 0.1266 Epoch 219/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1266 - mean_squared_error: 0.1266 Epoch 220/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1270 - mean_squared_error: 0.1270 Epoch 221/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1273 - mean_squared_error: 0.1273 Epoch 222/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1270 - mean_squared_error: 0.1270 Epoch 223/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1273 - mean_squared_error: 0.1273 Epoch 224/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1252 - mean_squared_error: 0.1252 Epoch 225/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1269 - mean_squared_error: 0.1269 Epoch 226/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1262 - mean_squared_error: 0.1262 Epoch 227/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1269 - mean_squared_error: 0.1269 Epoch 228/1000 403/403 
[==============================] - 0s 951us/step - loss: 0.1269 - mean_squared_error: 0.1269 Epoch 229/1000 403/403 [==============================] - 0s 981us/step - loss: 0.1291 - mean_squared_error: 0.1291 Epoch 230/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1262 - mean_squared_error: 0.1262 Epoch 231/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1271 - mean_squared_error: 0.1271 Epoch 232/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1248 - mean_squared_error: 0.1248 Epoch 233/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1241 - mean_squared_error: 0.1241 Epoch 234/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1269 - mean_squared_error: 0.1269 Epoch 235/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1258 - mean_squared_error: 0.1258 Epoch 236/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1249 - mean_squared_error: 0.1249 Epoch 237/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1271 - mean_squared_error: 0.1271 Epoch 238/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1262 - mean_squared_error: 0.1262 Epoch 239/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1249 - mean_squared_error: 0.1249 Epoch 240/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1265 - mean_squared_error: 0.1265 Epoch 241/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1271 - mean_squared_error: 0.1271 Epoch 242/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1269 - mean_squared_error: 0.1269 Epoch 243/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1267 - mean_squared_error: 0.1267 Epoch 244/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1244 - mean_squared_error: 0.1244 Epoch 245/1000 403/403 [==============================] - 0s 1ms/step - loss: 
0.1257 - mean_squared_error: 0.1257 Epoch 246/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1242 - mean_squared_error: 0.1242 Epoch 247/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1260 - mean_squared_error: 0.1260 Epoch 248/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1235 - mean_squared_error: 0.1235 Epoch 249/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1267 - mean_squared_error: 0.1267 Epoch 250/1000 403/403 [==============================] - 0s 954us/step - loss: 0.1255 - mean_squared_error: 0.1255 Epoch 251/1000 403/403 [==============================] - 0s 879us/step - loss: 0.1250 - mean_squared_error: 0.1250 Epoch 252/1000 403/403 [==============================] - 0s 955us/step - loss: 0.1256 - mean_squared_error: 0.1256 Epoch 253/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1237 - mean_squared_error: 0.1237 Epoch 254/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1262 - mean_squared_error: 0.1262 Epoch 255/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1263 - mean_squared_error: 0.1263 Epoch 256/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1243 - mean_squared_error: 0.1243 Epoch 257/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1234 - mean_squared_error: 0.1234 Epoch 258/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1237 - mean_squared_error: 0.1237 Epoch 259/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1239 - mean_squared_error: 0.1239 Epoch 260/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1250 - mean_squared_error: 0.1250 Epoch 261/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1233 - mean_squared_error: 0.1233 Epoch 262/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1254 - mean_squared_error: 0.1254 Epoch 263/1000 
403/403 [==============================] - 0s 1ms/step - loss: 0.1257 - mean_squared_error: 0.1257 Epoch 264/1000 403/403 [==============================] - 0s 970us/step - loss: 0.1239 - mean_squared_error: 0.1239 Epoch 265/1000 403/403 [==============================] - 0s 949us/step - loss: 0.1227 - mean_squared_error: 0.1227 Epoch 266/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1225 - mean_squared_error: 0.1225 Epoch 267/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1257 - mean_squared_error: 0.1257 Epoch 268/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1225 - mean_squared_error: 0.1225 Epoch 269/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1231 - mean_squared_error: 0.1231 Epoch 270/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1245 - mean_squared_error: 0.1245 Epoch 271/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1233 - mean_squared_error: 0.1233 Epoch 272/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1232 - mean_squared_error: 0.1232 Epoch 273/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1257 - mean_squared_error: 0.1257 Epoch 274/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1228 - mean_squared_error: 0.1228 Epoch 275/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1230 - mean_squared_error: 0.1230 Epoch 276/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1246 - mean_squared_error: 0.1246 Epoch 277/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1234 - mean_squared_error: 0.1234 Epoch 278/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1242 - mean_squared_error: 0.1242 Epoch 279/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1216 - mean_squared_error: 0.1216 Epoch 280/1000 403/403 [==============================] - 1s 1ms/step - 
loss: 0.1231 - mean_squared_error: 0.1231 Epoch 281/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1232 - mean_squared_error: 0.1232 Epoch 282/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1231 - mean_squared_error: 0.1231 Epoch 283/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1246 - mean_squared_error: 0.1246 Epoch 284/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1229 - mean_squared_error: 0.1229 Epoch 285/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1220 - mean_squared_error: 0.1220 Epoch 286/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1238 - mean_squared_error: 0.1238 Epoch 287/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1219 - mean_squared_error: 0.1219 Epoch 288/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1222 - mean_squared_error: 0.1222 Epoch 289/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1230 - mean_squared_error: 0.1230 Epoch 290/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1234 - mean_squared_error: 0.1234 Epoch 291/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1208 - mean_squared_error: 0.1208 Epoch 292/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 293/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1235 - mean_squared_error: 0.1235 Epoch 294/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1235 - mean_squared_error: 0.1235 Epoch 295/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1229 - mean_squared_error: 0.1229 Epoch 296/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1211 - mean_squared_error: 0.1211 Epoch 297/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1238 - mean_squared_error: 0.1238 Epoch 298/1000 
403/403 [==============================] - 0s 1ms/step - loss: 0.1238 - mean_squared_error: 0.1238 Epoch 299/1000 403/403 [==============================] - 0s 995us/step - loss: 0.1215 - mean_squared_error: 0.1215 Epoch 300/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1215 - mean_squared_error: 0.1215 Epoch 301/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1221 - mean_squared_error: 0.1221 Epoch 302/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1218 - mean_squared_error: 0.1218 Epoch 303/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1221 - mean_squared_error: 0.1221 Epoch 304/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 305/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1235 - mean_squared_error: 0.1235 Epoch 306/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1222 - mean_squared_error: 0.1222 Epoch 307/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1202 - mean_squared_error: 0.1202 Epoch 308/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1222 - mean_squared_error: 0.1222 Epoch 309/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1212 - mean_squared_error: 0.1212 Epoch 310/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1216 - mean_squared_error: 0.1216 Epoch 311/1000 403/403 [==============================] - 0s 995us/step - loss: 0.1205 - mean_squared_error: 0.1205 Epoch 312/1000 403/403 [==============================] - ETA: 0s - loss: 0.1216 - mean_squared_error: 0.12 - 0s 974us/step - loss: 0.1225 - mean_squared_error: 0.1225 Epoch 313/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1204 - mean_squared_error: 0.1204 Epoch 314/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1210 - mean_squared_error: 0.1210 Epoch 315/1000 
403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 316/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1206 - mean_squared_error: 0.1206 Epoch 317/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1215 - mean_squared_error: 0.1215 Epoch 318/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 319/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 320/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1224 - mean_squared_error: 0.1224 Epoch 321/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1219 - mean_squared_error: 0.1219 Epoch 322/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1238 - mean_squared_error: 0.1238 Epoch 323/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1209 - mean_squared_error: 0.1209 Epoch 324/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1218 - mean_squared_error: 0.1218 Epoch 325/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 326/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1208 - mean_squared_error: 0.1208 Epoch 327/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1212 - mean_squared_error: 0.1212 Epoch 328/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1224 - mean_squared_error: 0.1224 Epoch 329/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1194 - mean_squared_error: 0.1194 Epoch 330/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1226 - mean_squared_error: 0.1226 Epoch 331/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1210 - mean_squared_error: 0.1210 Epoch 332/1000 403/403 [==============================] - 1s 1ms/step - 
loss: 0.1203 - mean_squared_error: 0.1203 Epoch 333/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1207 - mean_squared_error: 0.1207 Epoch 334/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1216 - mean_squared_error: 0.1216 Epoch 335/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1224 - mean_squared_error: 0.1224 Epoch 336/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1199 - mean_squared_error: 0.1199 Epoch 337/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 338/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1190 - mean_squared_error: 0.1190 Epoch 339/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 340/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1216 - mean_squared_error: 0.1216 Epoch 341/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1196 - mean_squared_error: 0.1196 Epoch 342/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1209 - mean_squared_error: 0.1209 Epoch 343/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1196 - mean_squared_error: 0.1196 Epoch 344/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1214 - mean_squared_error: 0.1214 Epoch 345/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1202 - mean_squared_error: 0.1202 Epoch 346/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1190 - mean_squared_error: 0.1190 Epoch 347/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 348/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1191 - mean_squared_error: 0.1191 Epoch 349/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 350/1000 
403/403 [==============================] - 0s 1ms/step - loss: 0.1195 - mean_squared_error: 0.1195 Epoch 351/1000 403/403 [==============================] - 0s 994us/step - loss: 0.1213 - mean_squared_error: 0.1213 Epoch 352/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1193 - mean_squared_error: 0.1193 Epoch 353/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1182 - mean_squared_error: 0.1182 Epoch 354/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1188 - mean_squared_error: 0.1188 Epoch 355/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1206 - mean_squared_error: 0.1206 Epoch 356/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1206 - mean_squared_error: 0.1206 Epoch 357/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1217 - mean_squared_error: 0.1217 Epoch 358/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1204 - mean_squared_error: 0.1204 Epoch 359/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1192 - mean_squared_error: 0.1192 Epoch 360/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1202 - mean_squared_error: 0.1202 Epoch 361/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1186 - mean_squared_error: 0.1186 Epoch 362/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1204 - mean_squared_error: 0.1204 Epoch 363/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1192 - mean_squared_error: 0.1192 Epoch 364/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1206 - mean_squared_error: 0.1206 Epoch 365/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1185 - mean_squared_error: 0.1185 Epoch 366/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1207 - mean_squared_error: 0.1207 Epoch 367/1000 403/403 [==============================] - 0s 1ms/step - 
loss: 0.1214 - mean_squared_error: 0.1214 Epoch 368/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 369/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1196 - mean_squared_error: 0.1196 Epoch 370/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1199 - mean_squared_error: 0.1199 Epoch 371/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1204 - mean_squared_error: 0.1204 Epoch 372/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1187 - mean_squared_error: 0.1187 Epoch 373/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1192 - mean_squared_error: 0.1192 Epoch 374/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1183 - mean_squared_error: 0.1183 Epoch 375/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1180 - mean_squared_error: 0.1180 Epoch 376/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1203 - mean_squared_error: 0.1203 Epoch 377/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1189 - mean_squared_error: 0.1189 Epoch 378/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1204 - mean_squared_error: 0.1204 Epoch 379/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1201 - mean_squared_error: 0.1201 Epoch 380/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1192 - mean_squared_error: 0.1192 Epoch 381/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1183 - mean_squared_error: 0.1183 Epoch 382/1000 403/403 [==============================] - 0s 981us/step - loss: 0.1177 - mean_squared_error: 0.1177 Epoch 383/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1180 - mean_squared_error: 0.1180 Epoch 384/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1193 - mean_squared_error: 0.1193 Epoch 385/1000 
[Verbose per-epoch Keras output condensed — epochs 386–750 of 1000, 403 batches per epoch at roughly 1 ms/step: the training loss (mean_squared_error) decreases only gradually, from about 0.118 at epoch 386 to about 0.111 by epoch 750, indicating the model is approaching a plateau.]
1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 751/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 752/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1111 - mean_squared_error: 0.1111 Epoch 753/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 754/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1106 - mean_squared_error: 0.1106 Epoch 755/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1114 - mean_squared_error: 0.1114 Epoch 756/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 757/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1112 - mean_squared_error: 0.1112 Epoch 758/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1117 - mean_squared_error: 0.1117: 0s - loss: 0.1122 - mean_squared_error Epoch 759/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 760/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 761/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1110 - mean_squared_error: 0.1110 Epoch 762/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1111 - mean_squared_error: 0.1111 Epoch 763/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 764/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1110 - mean_squared_error: 0.1110 Epoch 765/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1115 - mean_squared_error: 0.1115 Epoch 766/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1117 - mean_squared_error: 0.1117 Epoch 767/1000 403/403 [==============================] - 1s 1ms/step - loss: 
0.1106 - mean_squared_error: 0.1106 Epoch 768/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 769/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1112 - mean_squared_error: 0.1112 Epoch 770/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1118 - mean_squared_error: 0.1118 Epoch 771/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 772/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1111 - mean_squared_error: 0.1111 Epoch 773/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 774/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 775/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 776/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 777/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 778/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1115 - mean_squared_error: 0.1115 Epoch 779/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1107 - mean_squared_error: 0.1107 Epoch 780/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1127 - mean_squared_error: 0.1127 Epoch 781/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 782/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 783/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 784/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1117 - mean_squared_error: 0.1117 Epoch 785/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 786/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1110 - mean_squared_error: 0.1110 Epoch 787/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 788/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1121 - mean_squared_error: 0.1121 Epoch 789/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 790/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1116 - mean_squared_error: 0.1116 Epoch 791/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 792/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1094 - mean_squared_error: 0.1094 Epoch 793/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1110 - mean_squared_error: 0.1110 Epoch 794/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1127 - mean_squared_error: 0.1127 Epoch 795/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 796/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 797/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1112 - mean_squared_error: 0.1112 Epoch 798/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1111 - mean_squared_error: 0.1111 Epoch 799/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 800/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 801/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1107 - mean_squared_error: 0.1107 Epoch 802/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1108 
- mean_squared_error: 0.1108 Epoch 803/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 804/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1113 - mean_squared_error: 0.1113 Epoch 805/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 806/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 807/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 808/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 809/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 810/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 811/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 812/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1112 - mean_squared_error: 0.1112 Epoch 813/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1117 - mean_squared_error: 0.1117 Epoch 814/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1106 - mean_squared_error: 0.1106 Epoch 815/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 816/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 817/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1115 - mean_squared_error: 0.1115 Epoch 818/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 819/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 820/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 821/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 822/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 823/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 824/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 825/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 826/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 827/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 828/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 829/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 830/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 831/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 832/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 833/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 834/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 835/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 836/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 837/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1110 
- mean_squared_error: 0.1110 Epoch 838/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1094 - mean_squared_error: 0.1094 Epoch 839/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 840/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1106 - mean_squared_error: 0.1106 Epoch 841/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1120 - mean_squared_error: 0.1120 Epoch 842/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 843/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 844/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1118 - mean_squared_error: 0.1118 Epoch 845/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1111 - mean_squared_error: 0.1111 Epoch 846/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 847/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1110 - mean_squared_error: 0.1110 Epoch 848/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 849/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1118 - mean_squared_error: 0.1118 Epoch 850/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 851/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 852/1000 403/403 [==============================] - 1s 2ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 853/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1094 - mean_squared_error: 0.1094 Epoch 854/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 855/1000 403/403 
[==============================] - 0s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 856/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 857/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1115 - mean_squared_error: 0.1115: 0s - loss: 0.1125 - mean_squared_error: 0. Epoch 858/1000 403/403 [==============================] - 1s 2ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 859/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 860/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 861/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1089 - mean_squared_error: 0.1089 Epoch 862/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 863/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 864/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1104 - mean_squared_error: 0.1104 Epoch 865/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 866/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 867/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1118 - mean_squared_error: 0.1118 Epoch 868/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 869/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 870/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1114 - mean_squared_error: 0.1114 Epoch 871/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1113 - mean_squared_error: 0.1113 Epoch 872/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 873/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 874/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 875/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 876/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 877/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 878/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 879/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1088 - mean_squared_error: 0.1088 Epoch 880/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1138 - mean_squared_error: 0.1138 Epoch 881/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 882/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 883/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 884/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 885/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1106 - mean_squared_error: 0.1106 Epoch 886/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 887/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 888/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 889/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1093 
- mean_squared_error: 0.1093 Epoch 890/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 891/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 892/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 893/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 894/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 895/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 896/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 897/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 898/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 899/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 900/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 901/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 902/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 903/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 904/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 905/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 906/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1141 - mean_squared_error: 0.1141 Epoch 907/1000 403/403 
[==============================] - 0s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 908/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 909/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1088 - mean_squared_error: 0.1088 Epoch 910/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 911/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 912/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 913/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 914/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1120 - mean_squared_error: 0.1120 Epoch 915/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 916/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 917/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 918/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 919/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1089 - mean_squared_error: 0.1089 Epoch 920/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 921/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 922/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1105 - mean_squared_error: 0.1105 Epoch 923/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 924/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1096 
- mean_squared_error: 0.1096 Epoch 925/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 926/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 927/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098 Epoch 928/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1086 - mean_squared_error: 0.1086 Epoch 929/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1089 - mean_squared_error: 0.1089 Epoch 930/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1107 - mean_squared_error: 0.1107 Epoch 931/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 932/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1101 - mean_squared_error: 0.1101 Epoch 933/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 934/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 935/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 936/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 937/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 938/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1080 - mean_squared_error: 0.1080 Epoch 939/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 940/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 941/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 942/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1088 - mean_squared_error: 0.1088 Epoch 943/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1089 - mean_squared_error: 0.1089 Epoch 944/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1079 - mean_squared_error: 0.1079 Epoch 945/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 946/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1098 - mean_squared_error: 0.1098: 0s - loss: 0.1100 - mean_squared_error Epoch 947/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 948/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1074 - mean_squared_error: 0.1074 Epoch 949/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 950/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 951/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 952/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1079 - mean_squared_error: 0.1079 Epoch 953/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 954/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 955/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 956/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 957/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 958/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 959/1000 403/403 
[==============================] - 1s 1ms/step - loss: 0.1100 - mean_squared_error: 0.1100 Epoch 960/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 961/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 962/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 963/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 964/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1109 - mean_squared_error: 0.1109 Epoch 965/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1088 - mean_squared_error: 0.1088 Epoch 966/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1074 - mean_squared_error: 0.1074 Epoch 967/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1093 - mean_squared_error: 0.1093 Epoch 968/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 969/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1086 - mean_squared_error: 0.1086 Epoch 970/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 971/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1079 - mean_squared_error: 0.1079 Epoch 972/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 973/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1108 - mean_squared_error: 0.1108 Epoch 974/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1086 - mean_squared_error: 0.1086 Epoch 975/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 976/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1088 
- mean_squared_error: 0.1088 Epoch 977/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1080 - mean_squared_error: 0.1080 Epoch 978/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1103 - mean_squared_error: 0.1103 Epoch 979/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092 Epoch 980/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1091 - mean_squared_error: 0.1091 Epoch 981/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1087 - mean_squared_error: 0.1087 Epoch 982/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1094 - mean_squared_error: 0.1094 Epoch 983/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1085 - mean_squared_error: 0.1085 Epoch 984/1000 403/403 [==============================] - 1s 2ms/step - loss: 0.1075 - mean_squared_error: 0.1075 Epoch 985/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1086 - mean_squared_error: 0.1086 Epoch 986/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 987/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1102 - mean_squared_error: 0.1102 Epoch 988/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1083 - mean_squared_error: 0.1083 Epoch 989/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1082 - mean_squared_error: 0.1082 Epoch 990/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1099 - mean_squared_error: 0.1099 Epoch 991/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1096 - mean_squared_error: 0.1096 Epoch 992/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1095 - mean_squared_error: 0.1095 Epoch 993/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1083 - mean_squared_error: 0.1083 Epoch 994/1000 403/403 
[==============================] - 0s 1ms/step - loss: 0.1097 - mean_squared_error: 0.1097 Epoch 995/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1094 - mean_squared_error: 0.1094 Epoch 996/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1084 - mean_squared_error: 0.1084 Epoch 997/1000 403/403 [==============================] - 0s 1ms/step - loss: 0.1107 - mean_squared_error: 0.1107 Epoch 998/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1090 - mean_squared_error: 0.1090 Epoch 999/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1086 - mean_squared_error: 0.1086 Epoch 1000/1000 403/403 [==============================] - 1s 1ms/step - loss: 0.1092 - mean_squared_error: 0.1092
# Predict on the test and training sets
y_pred_test = model.predict(X_test)
y_pred_train = model.predict(X_train)
Visualize the relationship between the predictions and the actual values with a scatter plot.
plt.figure(figsize=(10,10))
plt.scatter(y_test, y_pred_test, alpha = 0.5,color='#FF8080')
plt.xlabel("True Value",fontsize=15)
plt.ylabel("Prediction",fontsize=15)
plt.title('Price Prediction by Neural Network',fontsize=20)
# diagonal reference line (perfect predictions)
plt.plot(y_test, y_test, ls="--", c=".3")
plt.show()
print("Training RMSE:", np.sqrt(metrics.mean_squared_error(y_train, y_pred_train)))
print("Validation RMSE:",np.sqrt(metrics.mean_squared_error(y_test, y_pred_test)))
print("\nTraining MSE:", metrics.mean_squared_error(y_train, y_pred_train))
print("Validation MSE:", metrics.mean_squared_error(y_test, y_pred_test))
print("\nTraining r2:", metrics.r2_score(y_train, y_pred_train))
print("Validation r2:", metrics.r2_score(y_test, y_pred_test))
Training RMSE: 0.32502962917964606
Validation RMSE: 0.4063788823813931

Training MSE: 0.10564425984465822
Validation MSE: 0.16514379604555013

Training r2: 0.6244954322499268
Validation r2: 0.40143865997064687
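Since the RMSE values above are well below typical nightly prices, the target was presumably log-transformed earlier in the notebook. Under that assumption (it is not restated in this section), an RMSE in log space corresponds to a multiplicative error on the price, which can be sketched as follows; the `val_rmse` value is copied from the output above:

```python
import math

# Assumption: the target is the natural-log-transformed nightly price,
# so a log-space RMSE translates into a multiplicative error on the price.
val_rmse = 0.4064  # validation RMSE from the run above

# A one-RMSE miss over- or under-shoots the true price by this factor
error_factor = math.exp(val_rmse)
print(f"Typical multiplicative error: x{error_factor:.2f}")

# Example: the band this implies around a 100 EUR listing
low, high = 100 / error_factor, 100 * error_factor
print(f"100 EUR listing -> roughly {low:.0f}-{high:.0f} EUR")
```

This kind of back-of-the-envelope conversion makes the log-space metrics easier to communicate to hosts, who think in euros rather than log-units.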
Plot how the loss changes during training to check whether the model converged.
plt.figure(figsize=(10,10))
plt.plot(history.history['loss'][2:],color='#FF8080')
plt.title('Model loss',fontsize = 20)
plt.ylabel('Loss',fontsize = 15)
plt.xlabel('Epoch',fontsize = 15)
plt.show()
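The loss curve flattens out long before epoch 1000, which suggests early stopping could save most of the training time (Keras provides this as a callback; the standalone logic is sketched below with a simulated loss curve, since the real `history` object is not reproduced here):

```python
def should_stop(losses, patience=20, min_delta=1e-4):
    """Return True once the loss has not improved by at least
    `min_delta` for `patience` consecutive epochs."""
    if len(losses) <= patience:
        return False
    best_before = min(losses[:-patience])
    recent_best = min(losses[-patience:])
    return recent_best > best_before - min_delta

# Simulated loss curve: fast initial drop, then a long plateau
losses = [1.0 / (1 + e) for e in range(50)] + [0.02] * 40

for epoch in range(1, len(losses) + 1):
    if should_stop(losses[:epoch]):
        print(f"Would stop at epoch {epoch}")
        break
```

In Keras the equivalent is passing an `EarlyStopping` callback (monitoring `loss` with a `patience` argument) to `model.fit`, which would cut the 1000-epoch run short once the plateau is reached.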
We choose the Random Forest model as the final prediction model because it has the smallest RMSE, i.e., the smallest prediction error. For the neural network, more complex architectures and further hyperparameter tuning could be explored in the future.
One advantage of decision-tree-based methods is that they automatically provide an estimate of feature importance from the trained prediction model.
The importance reflects how valuable each feature was when building the decision trees in the model: the more often a feature is used for key split decisions, the higher its relative importance.
x = pd.DataFrame(reg_rf.feature_importances_, columns=['weight'], index=X_train.columns)
# sort by value
x.sort_values('weight', ascending=False, inplace=True)
x.plot.bar(figsize=(30,20),fontsize=15, color=['#ff8080'])
plt.title('Feature Importance',fontsize=40)
plt.ylabel('Weight', fontsize=30)
plt.show()
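Beyond plotting, the importance table can be used to pick a compact feature set, e.g. the smallest group of features that covers most of the total importance. A minimal sketch, using hypothetical importance values in place of `reg_rf.feature_importances_` (the real values are in the figure above):

```python
import pandas as pd

# Hypothetical importances standing in for reg_rf.feature_importances_
imp = pd.Series(
    {'accommodates': 0.30, 'room_type': 0.25, 'bathrooms': 0.15,
     'minimum_nights': 0.12, 'availability_365': 0.10, 'host_since': 0.08},
    name='weight',
).sort_values(ascending=False)

# Keep the smallest set of features covering 90% of the total importance
cumulative = imp.cumsum() / imp.sum()
top_features = cumulative[cumulative <= 0.90].index.tolist()
# Also include the feature that crosses the 90% threshold
if len(top_features) < len(imp):
    top_features.append(cumulative.index[len(top_features)])
print(top_features)
```

A reduced feature set like this could simplify the pricing advice given to hosts, focusing attention on the handful of listing attributes that drive most of the predicted price.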
After analyzing the Airbnb data of Amsterdam, we can say that a higher number of amenities increases the probability of the listing being accepted at a higher price, and we identified which amenities have the most influence on pricing. We also performed a customer review analysis to identify the factors (cleanliness, location, value for money) that hosts can improve to gain a competitive advantage, as well as the zones they can target to maximize profits from their listings while increasing the overall occupancy rate. Finally, after considering all of these, we compared multiple price prediction models and selected the Random Forest model as the final prediction model because it has the smallest RMSE, i.e., the smallest prediction error. This prediction model can be vital in helping hosts determine an optimal price for their listings.